Image processing method and related electronic device
12143732 · 2024-11-12
Assignee
Inventors
CPC classification
H04N1/6086
ELECTRICITY
H04N23/88
ELECTRICITY
International classification
Abstract
An image processing method and a related electronic device. The method includes: starting a camera in response to a first operation; acquiring, by the camera, a first image in response to a second operation; performing first image processing on the first image to obtain a second image in a second color space; obtaining a second parameter based on a first parameter of a first ambient light source, where the first parameter includes a white balance parameter of the first ambient light source, the second parameter is used to identify light source information of the first ambient light source in a third color space, and the third color space is different from the first color space and the second color space; performing second image processing on the second image based on the second parameter to obtain a target image; and storing the target image.
Claims
1. An image processing method, comprising: starting a camera of an electronic device in response to a first operation; acquiring a first image under a first ambient light source in response to a second operation, wherein the first image is an image in a first color space, and the first ambient light source is at least one ambient light source for acquisition of the first image; adjusting the first image based on a formula
2. The method of claim 1, wherein obtaining the second parameter based on the first parameter of the first ambient light source comprises: calculating a degree D of chromatic adaptation based on the first parameter; calculating, based on the first parameter, LMS values of a white point corresponding to the first ambient light source in the third color space; calculating a gain matrix based on a formula
3. The method of claim 2, wherein calculating the degree D of chromatic adaptation based on the first parameter comprises: calculating a first variable based on a formula P_xy = √((x_1 − x_N)^2 + (y_1 − y_N)^2), wherein P_xy is the first variable, x_N is an x value of the reference white point in the xy chromaticity diagram, and y_N is a y value of the reference white point in the xy chromaticity diagram; calculating a second variable based on a formula D_c = 1 − e^(−5.3·P_xy), wherein D_c is the second variable; and obtaining the degree D of chromatic adaptation based on a formula D = 0.96·D_c·((1 − e^(−4.28·log La))^(−406.51)) + 1, wherein La is a lighting value of the first ambient light source.
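Restoring the minus signs that the published text appears to have dropped, the computation recited in claim 3 can be sketched as follows. This is an illustrative reconstruction only: the exact form of the La-dependent term is garbled in the printed formula, so the version used here is an assumption, not the patent's verified method.

```python
import math

def degree_of_adaptation(x1, y1, xN, yN, La):
    """Sketch of the claim-3 computation (reconstructed; the extraction
    lost the minus signs, so the exact signs are assumptions).

    P_xy -- chromaticity distance between the source white point (x1, y1)
            and the reference white point (xN, yN)
    D_c  -- distance-driven term, assumed to be 1 - exp(-5.3 * P_xy)
    D    -- degree of chromatic adaptation, clamped to [0, 1]
    """
    P_xy = math.sqrt((x1 - xN) ** 2 + (y1 - yN) ** 2)
    D_c = 1.0 - math.exp(-5.3 * P_xy)
    # The published text garbles the La term; a luminance-dependent
    # weighting in (0, 1) is assumed here purely for illustration.
    La_term = 1.0 - math.exp(-4.28 * math.log(La)) if La > 1 else 0.0
    D = 0.96 * D_c * La_term
    return max(0.0, min(1.0, D))
```

With identical source and reference white points, P_xy is zero, so D collapses to zero, which matches the intuition that no adaptation correction is needed when the scene light already matches the reference illuminant.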
4. The method of claim 2, wherein calculating, based on the first parameter, LMS values of the white point corresponding to the first ambient light source in the third color space comprises: calculating a third variable based on a formula
5. The method of claim 1, wherein performing second image processing on the second image based on the second parameter to obtain the target image comprises: performing third image processing on the second image to obtain a third image; performing chromatic adaptation processing on the third image based on a formula
6. The method of claim 5, wherein performing third image processing on the second image to obtain the third image comprises: converting the second image from the second color space to an XYZ color space based on a formula
7. The method of claim 5, wherein performing fourth image processing on the fourth image to obtain the target image comprises: converting the fourth image from the third color space to an XYZ color space based on a formula
8. The method of claim 1, wherein performing second image processing on the second image based on the second parameter to obtain the target image comprises: performing fifth image processing on the second image based on a formula
9. An electronic device, comprising: a camera; one or more processors coupled to the camera; and a non-transitory memory coupled to the processor and configured to store instructions that, when executed by the processor, cause the electronic device to be configured to: start the camera in response to a first operation; acquire, by the camera, a first image under a first ambient light source in response to a second operation, wherein the first image is an image in a first color space, and the first ambient light source is at least one ambient light source for acquisition of the first image; adjust the first image based on a formula
10. The electronic device of claim 9, wherein the first color space is a RAW space, the second color space is an sRGB color space, and the third color space is an LMS color space or an XYZ color space, and wherein the first parameter comprises an x_1 value and a y_1 value of the first ambient light source in an xy chromaticity diagram.
11. The electronic device of claim 9, wherein obtaining the second parameter based on the first parameter of the first ambient light source comprises: calculating a degree D of chromatic adaptation based on the first parameter; calculating, based on the first parameter, LMS values of a white point corresponding to the first ambient light source in the third color space; calculating a gain matrix based on a formula
12. The electronic device of claim 11, wherein calculating the degree D of chromatic adaptation based on the first parameter comprises: calculating a first variable based on a formula P_xy = √((x_1 − x_N)^2 + (y_1 − y_N)^2), wherein P_xy is the first variable, x_N is an x value of the reference white point in an xy chromaticity diagram, and y_N is a y value of the reference white point in the xy chromaticity diagram; calculating a second variable based on a formula D_c = 1 − e^(−5.3·P_xy)
13. The electronic device of claim 11, wherein calculating, based on the first parameter, LMS values of the white point corresponding to the first ambient light source in the third color space comprises: calculating a third variable based on a formula
14. The electronic device of claim 9, wherein performing second image processing on the second image based on the second parameter to obtain the target image comprises: performing third image processing on the second image to obtain a third image; performing chromatic adaptation processing on the third image based on a formula
15. The electronic device of claim 14, wherein performing third image processing on the second image to obtain the third image comprises: converting the second image from the second color space to an XYZ color space based on a formula
16. The electronic device of claim 14, wherein performing fourth image processing on the fourth image to obtain the target image comprises: converting the fourth image from the third color space to an XYZ color space based on a formula
17. The electronic device of claim 9, wherein performing second image processing on the second image based on the second parameter to obtain the target image comprises: performing fifth image processing on the second image based on a formula
18. A non-transitory computer-readable storage medium comprising instructions that, when executed by a processor of an electronic device, cause the electronic device to be configured to: start a camera of the electronic device in response to a first operation; acquire, by the camera, a first image under a first ambient light source in response to a second operation, wherein the first image is an image in a first color space, and the first ambient light source is at least one ambient light source for acquisition of the first image; adjust the first image based on a formula
19. The non-transitory computer-readable storage medium of claim 18, wherein obtaining the second parameter based on the first parameter of the first ambient light source comprises: calculating a degree D of chromatic adaptation based on the first parameter; calculating, based on the first parameter, LMS values of a white point corresponding to the first ambient light source in the third color space; calculating a gain matrix based on a formula
20. The non-transitory computer-readable storage medium of claim 18, wherein performing second image processing on the second image based on the second parameter to obtain the target image comprises: performing third image processing on the second image to obtain a third image; performing chromatic adaptation processing on the third image based on a formula
Description
BRIEF DESCRIPTION OF DRAWINGS
DESCRIPTION OF EMBODIMENTS
(17) The following clearly and completely describes the technical solutions in the embodiments of this application with reference to the accompanying drawings. The described embodiments are merely some rather than all of the embodiments of this application. A reference to an embodiment in the specification means that a specific feature, structure, or characteristic described with reference to that embodiment may be included in at least one embodiment of this application. The phrase appearing in various positions in the specification does not necessarily refer to the same embodiment, nor to an independent or alternative embodiment mutually exclusive with other embodiments. It may be explicitly or implicitly understood by a person skilled in the art that the embodiments described in the specification may be combined with other embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of this application without creative efforts shall fall within the protection scope of this disclosure.
(18) In the specification, claims, and accompanying drawings of this application, the terms first, second, third, and so on are intended to distinguish between different objects but do not indicate a particular order of the objects. In addition, the terms comprise, include, and any variants thereof are intended to cover a non-exclusive inclusion: a process, method, product, or device that includes a list of steps or units is not limited to the listed steps or units, but optionally further includes steps or units that are not listed, or optionally further includes other steps or units inherent to such a process, method, product, or device.
(19) The accompanying drawings show merely some but not all content related to this application. Before the example embodiments are described in more detail, it should be noted that some example embodiments are described as processes or methods depicted as flowcharts. Although a flowchart describes operations (or steps) as sequential processing, many of the operations may be implemented in parallel, concurrently, or simultaneously. In addition, the sequence of the operations may be rearranged. The processing may be terminated when its operations are completed, but there may be additional steps not included in the accompanying drawings. The processing may correspond to a method, a function, a procedure, a subroutine, a sub-program, or the like.
(20) The terms component, module, system, unit, and so on used in the specification are used to indicate a computer-related entity, hardware, firmware, a combination of hardware and software, software, or software in execution. For example, the unit may be, but is not limited to, a process running on a processor, a processor, an object, an executable file, an execution thread, or a program, and/or may be distributed between two or more computers. In addition, these units may be executed from various computer-readable media in which various data structures are stored. For example, the unit may perform communication by using a local process and/or a remote process based on a signal having one or more data packets (for example, data from a second unit interacting with a local system, a distributed system, and/or another unit in a network such as the Internet, which interacts with another system by using a signal).
(21) Some technical terms included in the embodiments of this application are explained below.
(1) Planckian locus: An object that neither reflects nor transmits radiation falling on it, but absorbs all of that radiation, is referred to as a black body or a perfect radiator. When the black body is continuously heated, the maximum of its relative spectral power distribution moves toward shorter wavelengths, and the corresponding light color changes in a sequence of red, yellow, white, and blue. The arc-shaped locus formed in a chromaticity diagram by the black body's light color at different temperatures is referred to as the black body locus or Planckian locus.
(2) Correlated color temperature (Correlated Colour Temperature, CCT): the temperature of the black body radiator whose color is closest to that of a stimulus of the same brightness, expressed in kelvins and used as a metric for describing the color of light near the Planckian locus. Light sources other than thermal radiation light sources have line spectra, and their radiation characteristics differ considerably from those of a black body. Therefore, the light colors of these sources may not fall exactly on the black body locus in a chromaticity diagram, and the CCT is usually used to describe their color characteristics.
(3) Chromaticity distance (D_uv): the distance from the chromaticity value (u, v) of a test light source to the nearest point on the Planckian locus, where D_uv represents both the magnitude and the direction (toward green or toward pink) of the color offset of the test light source's chromaticity (u, v) from the Planckian locus.
(4) Lighting value (Lighting Value, Lv): used to estimate ambient brightness, where a specific calculation formula is as follows:
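As an illustration of the CCT and D_uv concepts defined in items (2) and (3), the following sketch uses McCamy's well-known approximation for CCT and the CIE 1960 UCS transform, which defines the (u, v) plane in which D_uv is measured. Both are standard colorimetric tools, not the patent's own method:

```python
import math

def xy_to_cct_mccamy(x, y):
    """Approximate CCT (in kelvins) from CIE 1931 (x, y) chromaticity
    using McCamy's formula, a standard cubic approximation valid near
    the Planckian locus."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

def xy_to_uv(x, y):
    """CIE 1931 (x, y) -> CIE 1960 UCS (u, v), the plane in which the
    chromaticity distance D_uv to the Planckian locus is defined."""
    d = -2.0 * x + 12.0 * y + 3.0
    return 4.0 * x / d, 6.0 * y / d
```

For the D65 white point (x, y) ≈ (0.3127, 0.3290), the McCamy estimate lands near 6500 K, and the UCS coordinates come out near (0.198, 0.312), consistent with published D65 values.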
(22)
where
(23) Exposure is an exposure time, Aperture is an aperture size, Iso is a light sensitivity, and Luma is an average value of Y of an image in an XYZ color space.
(24) Human eyes have color constancy. That is, when the color of the light irradiating an object's surface changes, people's perception of the color of that surface remains unchanged. A charge-coupled device (Charge-coupled Device, CCD) circuit or a CMOS circuit that converts an optical signal into an electrical signal in a camera cannot correct for a color change of a light source as human eyes do. Therefore, a chromaticity value of the light source of a captured image needs to be estimated by using a white balance algorithm, and the image color needs to be adjusted by using the estimated chromaticity value, to eliminate the impact of the color temperature of the light source in the photographing environment on the image color. The processed image color is then not affected by the CCT of the light source in the photographing environment, thereby avoiding image color deviation.
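Paragraph (24) describes estimating the light source's chromaticity and then correcting the image with white-balance gains. The patent does not specify its estimator in this passage, so the following minimal sketch uses the classic gray-world assumption (the average scene color is neutral) purely as an illustration of the gain-based correction it describes:

```python
def gray_world_gains(img):
    """Estimate white-balance gains under the gray-world assumption.
    img is a list of (r, g, b) pixels in linear RGB. Returns per-channel
    gains normalized to the green channel, analogous in spirit to the
    1/R_g and 1/B_g gains applied in step S604 of the description."""
    n = len(img)
    r = sum(p[0] for p in img) / n
    g = sum(p[1] for p in img) / n
    b = sum(p[2] for p in img) / n
    return g / r, 1.0, g / b   # (R gain, G gain, B gain)

def apply_gains(img, gains):
    """Apply per-channel gains to every pixel."""
    gr, gg, gb = gains
    return [(p[0] * gr, p[1] * gg, p[2] * gb) for p in img]
```

Applying the estimated gains to an image with a uniform color cast maps every pixel back to neutral gray, which is exactly the effect white balance processing is meant to achieve.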
(25) For example,
(26)
(27) It can be learned from
(28) However, scientific research has found that human eyes exhibit incomplete chromatic adaptation; that is, human eyes cannot completely correct the color of an object under the influence of the ambient light source and ambient brightness. As a result, human eyes do not maintain color constancy across multiple light source environments, and during observation of an object, the visual impression of the object usually differs from the object's real color. Therefore, after a camera performs white balance processing on an image, although the impact of factors such as the correlated color temperature of the ambient light source on the color of the photographed object is eliminated and the original color of the image is restored, the color of the white-balanced image may still be inconsistent with the color really observed by human eyes because of this incomplete chromatic adaptation.
(29) For example, as shown in
(30) To resolve the foregoing problem that a color of an image obtained through white balance processing is inconsistent with a color of an object really observed by human eyes, the embodiments of this application propose an image chromatic adaptation processing method. The method includes: An electronic device calculates chromaticity information of a light source in a photographing environment by using a white balance algorithm, and performs white balance processing and color restoration processing on an image to obtain a processed image, where the processed image is an image under a reference light source (for example, a D65 light source). Then the electronic device calculates a chromatic adaptation conversion matrix based on the chromaticity information of the light source in the photographing environment and a chromatic adaptation algorithm; performs chromatic adaptation processing on the image in an LMS color space by using the chromatic adaptation conversion matrix to obtain an image obtained through chromatic adaptation processing; converts the image obtained through chromatic adaptation processing from the LMS color space to an XYZ color space; converts the image from the XYZ color space to an sRGB color space; and then performs a series of processing on the image to obtain a chromatically adapted image consistent with a visual effect of human eyes.
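The chromatic-adaptation step outlined in paragraph (30) is, in essence, a von Kries-style per-channel scaling in the LMS color space. The following is a minimal sketch of that idea, assuming the standard CAT02 XYZ-to-LMS matrix; the patent's actual matrices are not reproduced in this text, so CAT02 is an illustrative stand-in:

```python
# CAT02 XYZ -> LMS matrix (a standard choice, assumed here).
CAT02 = [[ 0.7328, 0.4296, -0.1624],
         [-0.7036, 1.6975,  0.0061],
         [ 0.0030, 0.0136,  0.9834]]

def mat_vec(M, v):
    """3x3 matrix times 3-vector."""
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

def adapt(xyz, src_white_xyz, dst_white_xyz, D=1.0):
    """Map an XYZ color seen under the source white point to its
    appearance under the destination white point. D in [0, 1] is the
    degree of adaptation; D < 1 models the incomplete chromatic
    adaptation of human eyes discussed in paragraph (28)."""
    lms = mat_vec(CAT02, xyz)
    lms_ws = mat_vec(CAT02, src_white_xyz)
    lms_wd = mat_vec(CAT02, dst_white_xyz)
    # Per-channel von Kries gains, blended toward 1 when D < 1.
    gains = [D * (wd / ws) + (1.0 - D) for ws, wd in zip(lms_ws, lms_wd)]
    return [g * c for g, c in zip(gains, lms)]  # adapted LMS values
```

With D = 1, adapting the source white point itself yields exactly the destination white point's LMS values; with identical source and destination whites, the gains are all 1 and the color passes through unchanged, regardless of D.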
(31) The following describes a hardware structure of an electronic device included in the embodiments of this application with reference to
(32) The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communications module 150, a wireless communications module 160, an audio module 170, a speaker 170A, a telephone receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identification module (subscriber identification module, SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, an optical proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like. It can be understood that the structure shown in this embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In some other embodiments of this application, the electronic device 100 may include more or fewer components than shown in the figure, or some components may be combined, or some components may be split, or there may be a different component layout. The components shown in the figure may be implemented by hardware, software, or a combination of software and hardware.
(33) The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural-network processing unit (neural-network processing unit, NPU). Different processing units may be separate devices, or may be integrated into one or more processors.
(34) A wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communications module 150, the wireless communications module 160, the modem processor, the baseband processor, and the like.
(35) The antenna 1 and the antenna 2 are configured to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 may be configured to cover one or more communication frequency bands. Different antennas may further support multiplexing so as to increase antenna utilization. For example, the antenna 1 may be also used as a diversity antenna of a wireless local area network. In some other embodiments, an antenna may be used in combination with a tuning switch.
(36) The mobile communications module 150 may provide wireless communication solutions including 2G, 3G, 4G, 5G, and the like which are applied to the electronic device 100. The mobile communications module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (low noise amplifier, LNA), and the like. The mobile communications module 150 may receive an electromagnetic wave by using the antenna 1, perform processing such as filtering and amplification on the received electromagnetic wave, and transmit a processed electromagnetic wave to the modem processor for demodulation. The mobile communications module 150 may further amplify a signal modulated by the modem processor, and convert an amplified signal into an electromagnetic wave and radiate the electromagnetic wave by using the antenna 1. In some embodiments, at least some functional modules of the mobile communications module 150 may be disposed in the processor 110. In some embodiments, at least some functional modules of the mobile communications module 150 and at least some modules of the processor 110 may be disposed in a same device.
(37) The wireless communications module 160 may provide wireless communication solutions applied to the electronic device 100, including wireless local area network (wireless local area networks, WLAN) (for example, a wireless fidelity (Wi-Fi) network), Bluetooth (Bluetooth, BT), Bluetooth low energy (BLE) broadcast, global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC) technology, and infrared (infrared, IR) technology. The wireless communications module 160 may be one or more devices that integrate at least one communications processing module. The wireless communications module 160 receives an electromagnetic wave by using the antenna 2, performs frequency modulation and filtering processing on an electromagnetic wave signal, and sends a processed signal to the processor 110. The wireless communications module 160 may further receive a to-be-sent signal from the processor 110, perform frequency modulation and amplification on the signal, and convert a processed signal into an electromagnetic wave and radiate the electromagnetic wave by using the antenna 2.
(38) The electronic device 100 implements a display function by using the GPU, the display 194, the application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is configured to perform mathematical and geometric calculation, and is used for graphics rendering. The processor 110 may include one or more GPUs that execute a program instruction to generate or change display information.
(39) The display 194 is configured to display an image, a video, and the like. The display 194 includes a display panel. The display panel may be a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a mini LED, a micro LED, a micro OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include one or N displays 194, where N is a positive integer greater than 1.
(40) The electronic device 100 may implement a photographing function by using the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
(41) The ISP is configured to process data fed back by the camera 193. For example, during photographing, a shutter is opened, light is transmitted to a photosensitive element of the camera through a lens, an optical signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, to convert the electrical signal into an image visible to a naked eye. The ISP may further optimize noise, brightness, and complexion of the image based on an algorithm. The ISP may further optimize parameters such as exposure and color temperature of a photographing scenario. In some embodiments, the ISP may be disposed in the camera 193.
(42) The digital signal processor is configured to process a digital signal. In addition to a digital image signal, the digital signal processor may further process another digital signal. For example, when the electronic device 100 selects a frequency, the digital signal processor is configured to perform Fourier transformation and the like on frequency energy.
(43) The NPU is a neural-network (neural-network, NN) computing processor. The NPU quickly processes input information with reference to a structure of a biological neural network, for example, with reference to a transfer mode between human brain neurons, and may further continuously perform self-learning. The NPU may implement applications such as intelligent cognition of the electronic device 100, for example, image recognition, facial recognition, speech recognition, and text comprehension.
(44) The electronic device 100 may implement audio functions, for example, music playing and recording, by using the audio module 170, the speaker 170A, the telephone receiver 170B, the microphone 170C, the earphone jack 170D, the application processor, and the like.
(45) The audio module 170 is configured to convert digital audio information into an analog audio signal for output, and is also configured to convert analog audio input into a digital audio signal. The audio module 170 may be further configured to encode and decode an audio signal. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 are disposed in the processor 110.
(46) The speaker 170A, also referred to as a loudspeaker, is configured to convert an audio electrical signal into a sound signal. The electronic device 100 may be used to listen to music or answer a hands-free call by using the speaker 170A.
(47) The telephone receiver 170B, also referred to as an earpiece, is configured to convert an audio electrical signal into a sound signal. When the electronic device 100 is used to answer a call or listen to voice information, the telephone receiver 170B may be placed close to a human ear to listen to a voice.
(48) The microphone 170C, also referred to as a mike or a mic, is configured to convert a sound signal into an electrical signal. When making a call or sending voice information, a user may move a mouth close to the microphone 170C and make a sound, to input a sound signal into the microphone 170C. At least one microphone 170C may be disposed in the electronic device 100. In some other embodiments, two microphones 170C may be disposed in the electronic device 100, to implement a noise reduction function, in addition to collecting a sound signal. In some other embodiments, three, four, or more microphones 170C may be alternatively disposed in the electronic device 100, to collect a sound signal and reduce noise. The microphones may further identify a sound source, implement a directional recording function, and the like.
(49) The pressure sensor 180A is configured to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed in the display 194.
(50) The barometric pressure sensor 180C is configured to measure barometric pressure. In some embodiments, the electronic device 100 calculates an altitude by using a barometric pressure value measured by the barometric pressure sensor 180C, to assist in positioning and navigation.
(51) The magnetic sensor 180D includes a Hall effect sensor. The electronic device 100 may detect opening/closing of a clamshell smart cover by using the magnetic sensor 180D.
(52) The acceleration sensor 180E may detect a magnitude of an acceleration of the electronic device 100 in each direction (usually three axes). When the electronic device 100 is still, a magnitude and a direction of gravity may be detected. The acceleration sensor 180E may be further configured to identify a posture of the electronic device, and is applied to applications such as landscape/portrait mode switching and a pedometer.
(53) The fingerprint sensor 180H is configured to collect a fingerprint. The electronic device 100 may implement fingerprint-based unlocking, application access lock, fingerprint-based photographing, fingerprint-based call answering, and the like by using characteristics of a collected fingerprint.
(54) The touch sensor 180K is also referred to as a touch panel. The touch sensor 180K may be disposed in the display 194, and the touch sensor 180K and the display 194 form a touchscreen, which is also referred to as a touch control screen. The touch sensor 180K is configured to detect a touch operation acting on or near the touch sensor. The touch sensor may transmit the detected touch operation to the application processor to determine a type of a touch event. Visual output related to the touch operation may be provided by using the display 194. In some other embodiments, the touch sensor 180K may be alternatively disposed on a surface of the electronic device 100 at a different location than the display 194.
(55) The bone conduction sensor 180M may obtain a vibration signal. In some embodiments, the bone conduction sensor 180M may obtain a vibration signal from a vibration bone of a human voice part.
(56) The following describes three application scenarios included in an image chromatic adaptation processing method provided in the embodiments of this application.
(57) Scenario 1: After an electronic device starts a camera application, the electronic device performs white balance processing and chromatic adaptation processing on an image acquired by the camera, and displays a processed image on a preview screen. When the electronic device 100 detects a tap operation on a photographing control, the electronic device takes a photo and stores the photo, where the photo is an image obtained through chromatic adaptation processing. The following describes the scenario 1 with reference to
(58) As shown in
(59) As shown in
(60) As shown in
(61)
(62) Scenario 2: After an electronic device 100 starts a camera application, an image preview window on a photographing screen displays an image of a photographing environment that is obtained through white balance processing. The photographing screen includes a chromatic adaptation processing control. When the electronic device 100 detects an input operation, for example, tapping, performed by a user on the chromatic adaptation processing control, the electronic device performs chromatic adaptation processing on a preview image in response to the operation, and displays a preview image obtained through chromatic adaptation processing in the image preview window. When the electronic device 100 detects an input operation, for example, tapping, on a capture control, the electronic device 100 takes a photo and stores a photo obtained through chromatic adaptation processing. The following describes the scenario 2 with reference to
(63) As shown in
(64) As shown in
(65) As shown in
(66) Scenario 3: An electronic device performs chromatic adaptation processing on a picture offline: A user can view a picture taken by the user in a gallery. When the electronic device 100 detects an input operation on a chromatic adaptation processing control, the electronic device 100 may perform chromatic adaptation processing on a photo in response to the input operation, to obtain and store a photo obtained through chromatic adaptation processing. The following describes the scenario 3 with reference to
(67) As shown in
(68) As shown in
(69) As shown in
(70) As shown in
(71) As shown in
(72) The foregoing describes, with reference to
(73) The following describes, with reference to
(74) Step S601: The electronic device predicts a light source by using a RAW image output by a camera, and calculates R.sub.g and B.sub.g of a white point of a light source in a photographing environment and RGB values of the white point of the light source in the photographing environment.
(75) Step S602: The electronic device calculates an x.sub.1 value and a y.sub.1 value of the white point of the light source in the environment based on the RGB values of the white point of the light source in the photographing environment, and then calculates CCT.sub.src and D.sub.uv_src of the light source in the photographing environment based on the x.sub.1 value and the y.sub.1 value.
(76) Step S603: The electronic device performs CCM interpolation based on the CCT of the light source in the photographing environment to obtain a color correction matrix CCM.
(77) Step S604: The electronic device performs white balance processing on the RAW image based on 1/R.sub.g and 1/B.sub.g to obtain a RAW image obtained through white balance processing.
(78) Step S605: The electronic device performs, by using the CCM, color restoration processing on the RAW image obtained through white balance processing to obtain a second image in an sRGB color space.
(79) Step S606: The electronic device converts the second image to an XYZ color space by using an sRGB-to-XYZ matrix to obtain a first converted image.
(80) Step S607: The electronic device converts the first converted image to an LMS color space by using an XYZ-to-LMS matrix to obtain a third image.
(81) Step S608: The electronic device performs chromatic adaptation processing on the third image to obtain a fourth image.
(82) Step S609: The electronic device converts the fourth image to the XYZ color space by using an LMS-to-XYZ matrix to obtain a second converted image.
(83) Step S610: The electronic device converts the second converted image to the sRGB color space by using an XYZ-to-sRGB matrix to obtain a third converted image.
(84) Step S611: The electronic device performs RGB color space image processing on the third converted image to obtain a fourth converted image.
(85) Specifically, the RGB color space image processing includes performing DRC (dynamic range compression) processing and/or gamma (GAMMA) processing on the third converted image.
(86) Step S612: The electronic device converts the fourth converted image from the sRGB color space to a YUV color space to obtain a fifth converted image.
(87) Step S613: The electronic device performs YUV color space processing on the fifth converted image to obtain a target image.
(88) Specifically, the YUV color space image processing includes performing tone mapping processing (Tone mapping processing) and/or 3D LUT processing on the image.
(89) In a possible implementation, after performing YUV color space processing on the fifth converted image, the electronic device may convert a processed image into a JPEG format to obtain the target image.
(90) By processing the RAW image output by the camera based on the foregoing block diagram of image processing, the electronic device can obtain a chromatically adapted image consistent with a visual effect of human eyes.
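As an illustration only, the S601 to S613 flow above can be sketched in Python. This is a hedged sketch, not the patented implementation: the function names are invented for illustration, the CAT and CA matrices are identity placeholders, and the standard linear-sRGB (D65) matrix is assumed for the sRGB/XYZ conversions.

```python
import numpy as np

# Standard linear-sRGB (D65) matrix; the CAT and CA matrices are identity
# placeholders here, because the patent's exact values are not reproduced.
M_SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                          [0.2126, 0.7152, 0.0722],
                          [0.0193, 0.1192, 0.9505]])

def white_balance(raw, r_gain, b_gain):
    """Step S604: per-channel gains on an HxWx3 linear RGB image."""
    return raw * np.array([r_gain, 1.0, b_gain])

def apply_matrix(img, m):
    """Apply a 3x3 color matrix to every pixel of an ...x3 image."""
    return img @ np.asarray(m).T

def pipeline(raw, r_gain, b_gain, ccm, xyz_to_lms, ca):
    img = white_balance(raw, r_gain, b_gain)               # S604
    img = apply_matrix(img, ccm)                           # S605: -> sRGB
    img = apply_matrix(img, M_SRGB_TO_XYZ)                 # S606: -> XYZ
    img = apply_matrix(img, xyz_to_lms)                    # S607: -> LMS
    img = apply_matrix(img, ca)                            # S608: chromatic adaptation
    img = apply_matrix(img, np.linalg.inv(xyz_to_lms))     # S609: -> XYZ
    img = apply_matrix(img, np.linalg.inv(M_SRGB_TO_XYZ))  # S610: -> sRGB
    return img
```

With identity CCM, CAT, and CA matrices and unit gains, the pipeline returns the input unchanged, which is a convenient sanity check of the conversion round trip.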
(91) The following describes in detail the steps in the block diagram of
(92) Step S701: An electronic device starts a camera application in response to a first operation.
(93) For example, the first operation may be the tap operation on the camera icon in the embodiment of
(94) Step S702: The electronic device calculates chromaticity information of a white point of a light source in a photographing environment by using a white balance algorithm.
(95) Specifically, after the camera is started, the camera outputs a RAW image. The RAW image is a first image in a first color space, and the first color space is a RAW space. The RAW image is original data obtained by a CMOS image sensor or a CCD image sensor of the camera by converting a captured light source signal into a digital signal. After the camera outputs the RAW image, the electronic device may calculate RGB values of the white point of the light source in the photographing environment in an RGB color space, R.sub.g, and B.sub.g based on the white balance algorithm. The RGB values, R.sub.g, and B.sub.g are the chromaticity information of the white point of the light source in the photographing environment. The light source in the photographing environment is a first ambient light source, and the first ambient light source includes at least one ambient light source. For example, the ambient light source may be a solar light source or a lamp light source. R.sub.gain=1/R.sub.g, B.sub.gain=1/B.sub.g, R.sub.gain=G/R, and B.sub.gain=G/B, where R.sub.gain and B.sub.gain are respectively gain values on an R channel and a B channel to be used for white balance processing under the ambient light source, and R, G, and B are an R value, a G value, and a B value of an RGB channel respectively.
(96) It should be understood that the electronic device may calculate the chromaticity information of the light source in the photographing environment by using a conventional automatic white balance algorithm (for example, a gray world algorithm), or may calculate the chromaticity information of the light source in the photographing environment by using an AI automatic white balance algorithm. A type of the white balance algorithm used by the electronic device is not limited in this embodiment of this application.
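As one hedged example of such a conventional algorithm, a gray world estimate of R.sub.gain=G/R and B.sub.gain=G/B from channel means might look as follows; the function name and the use of channel means are illustrative assumptions, not the patented algorithm.

```python
import numpy as np

# Gray world assumption: the scene averages to neutral, so the channel means
# estimate the white point of the light source. Illustrative sketch only.
def gray_world_gains(raw):
    """raw: HxWx3 linear RGB array; returns (R_gain, B_gain) = (G/R, G/B)."""
    r_mean, g_mean, b_mean = raw.reshape(-1, 3).mean(axis=0)
    return g_mean / r_mean, g_mean / b_mean
```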
(97) Step S703: The electronic device performs white balance processing and color restoration processing on the RAW image output by the camera to obtain a second image.
(98) Specifically, to resolve a problem of an image color deviation caused by the light source in the photographing environment, the electronic device needs to perform white balance processing and color restoration processing on the RAW image output by the camera to obtain the second image. A specific process of performing, by the electronic device, white balance processing on the RAW image is as follows: The electronic device adjusts RGB values of each pixel of the RAW image to make values of the three color channels of the white point of the light source of the image equal, that is, R=G=B. The electronic device may adjust the RGB values of the RAW image by using a formula (1), to perform white balance adjustment on the image. The formula (1) is shown below:
(99) $$\begin{bmatrix} R'_i \\ G'_i \\ B'_i \end{bmatrix} = \begin{bmatrix} R_{gain} & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & B_{gain} \end{bmatrix} \begin{bmatrix} R_i \\ G_i \\ B_i \end{bmatrix} \quad (1)$$
where $R_i$, $G_i$, and $B_i$ are RGB values of the i-th pixel of the RAW image, and $R'_i$, $G'_i$, and $B'_i$ are RGB values of the i-th pixel of an image obtained through white balance adjustment.
(101) Color restored by the camera differs greatly from the object color perceived by an observer, because there is a large difference between a spectral response of a semiconductor sensor and a spectral response of human eyes to visible light. In addition, RGB of the RAW image is a device-related RGB color space, but not a general color space. Moreover, after white balance adjustment is performed on the RAW image, only some colors of a photographed object in the image are restored (for example, only neutral colors such as white or gray in the image are restored). Therefore, a degree of color restoration of an object in the image needs to be improved, and a color space of the image needs to be converted from the device-related Device RGB space to a device-independent sRGB space. The electronic device may perform, by using a color correction matrix (Color Correction Matrix, CCM), color restoration processing on the image obtained through white balance processing to obtain the second image.
(102) The electronic device may obtain, through calibration, a CCM with a size of 3×3 in different light source environments (typical light sources include A, H, U30, TL84, D50, D65, D75, and the like) and store the CCM on the electronic device. The electronic device may select a corresponding CCM by using the RGB values, calculated by the electronic device, of the white point of the light source in the photographing environment. If the RGB values are between two light sources (for example, the RGB values of the white point of the light source in the photographing environment fall between D50 and D65), the CCM may be obtained by performing bilinear interpolation by using D50 and D65. For example, a color correction matrix of D50 is CCM.sub.1, and a correlated color temperature is CCT.sub.1; a color correction matrix of D65 is CCM.sub.2, and a correlated color temperature is CCT.sub.2; and a correlated color temperature of the light source in the photographing environment is CCT.sub.a. The electronic device may obtain a ratio g based on a formula (2). The formula (2) is shown below:
(103)
(104) Then the electronic device may calculate, based on a formula (3), the CCM corresponding to the white point of the light source in the photographing environment:
$$CCM = g \cdot CCM_1 + (1 - g) \cdot CCM_2 \quad (3)$$
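The exact ratio of formula (2) is not reproduced in this text; a minimal sketch, assuming simple linear interpolation in correlated color temperature consistent with the formula (3) blend, could be:

```python
import numpy as np

# Hypothetical linear blend in CCT between two calibrated CCMs; the exact
# ratio g of formula (2) may differ in the patented method.
def interpolate_ccm(ccm1, cct1, ccm2, cct2, cct_a):
    """Blend 3x3 CCMs calibrated at cct1 and cct2 for a scene CCT cct_a."""
    g = (cct2 - cct_a) / (cct2 - cct1)  # g = 1 at cct1, g = 0 at cct2
    return g * np.asarray(ccm1) + (1.0 - g) * np.asarray(ccm2)  # formula (3)
```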
(105) After the electronic device calculates the CCM corresponding to the white point of the light source in the photographing environment, the electronic device may adjust the RGB values of the image by using a formula (4) to obtain the second image. The formula (4) is shown below:
(106) $$\begin{bmatrix} R''_i \\ G''_i \\ B''_i \end{bmatrix} = CCM \cdot \begin{bmatrix} R'_i \\ G'_i \\ B'_i \end{bmatrix} \quad (4)$$
where $R'_i$, $G'_i$, and $B'_i$ are the RGB values of the i-th pixel of the RAW image obtained through white balance adjustment, and $R''_i$, $G''_i$, and $B''_i$ are RGB values of the i-th pixel of the second image. In this case, a color space of the second image is an sRGB color space, and the sRGB color space is a second color space.
(108) It should be understood that the foregoing embodiment is merely example descriptions of performing, by the electronic device, white balance processing and color restoration processing on the image, and the electronic device may alternatively perform white balance processing and color restoration processing on the RAW image by using another method to obtain the second image. A manner of performing, by the electronic device, white balance processing and color restoration processing on the RAW image is not limited in this embodiment of this application.
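The two operations of this step, the diagonal white balance gains of formula (1) followed by the 3×3 CCM of formula (4), can be sketched as follows (illustrative only; the pixel layout and function name are assumptions):

```python
import numpy as np

# Formula (1): diagonal white-balance gains, then formula (4): the 3x3 CCM,
# applied to every pixel of a linear RAW image (illustrative sketch).
def wb_and_color_restore(raw, r_gain, b_gain, ccm):
    gains = np.diag([r_gain, 1.0, b_gain])  # formula (1) gain matrix
    balanced = np.asarray(raw) @ gains.T    # white balance adjustment
    return balanced @ np.asarray(ccm).T     # formula (4): second image
```

A neutral pixel captured under a colored light source, for example [2, 1, 0.5] with gains R.sub.gain=0.5 and B.sub.gain=2, maps back to equal channel values.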
(109) Step S704: The electronic device converts the second image from the sRGB color space to an XYZ color space to obtain a first converted image.
(110) Specifically, the second image obtained through white balance processing resolves a problem of a color deviation due to impact of the light source in the photographing environment. However, according to a theory of incomplete chromatic adaptation of human eyes, human eyes do not always maintain color constancy due to impact of ambient brightness and an ambient light source, that is, a color of an object really observed by human eyes always deviates from a real color of the object. Therefore, a color of the second image obtained through white balance processing and color restoration processing is different from a color observed by human eyes under some ambient light sources. To make the color of the image consistent with that observed by human eyes, color adjustment may be performed on the image in an LMS color space, so that the color of the image is consistent with the color really observed by human eyes. The LMS color space is a third color space.
(111) The LMS color space is a color space represented by a response of three types of cones of human eyes, and is named after their responsivity at a long wavelength, a medium wavelength, and a short wavelength. Chromatic adaptation processing may be performed on an image in the LMS color space, so that a color of a processed image is more consistent with a color really observed by human eyes. Because the LMS color space may be converted from the XYZ color space, the electronic device may convert the color space of the second image from the sRGB color space to the XYZ color space to obtain the first converted image.
(112) The electronic device may convert the second image by using a formula (5) to obtain the first converted image. The formula (5) is shown below:
(113) $$\begin{bmatrix} X_i \\ Y_i \\ Z_i \end{bmatrix} = M_1 \cdot \begin{bmatrix} R_i \\ G_i \\ B_i \end{bmatrix} \quad (5)$$
where $R_i$, $G_i$, and $B_i$ are the RGB values of the i-th pixel of the second image, $X_i$, $Y_i$, and $Z_i$ are XYZ values of the i-th pixel of the first converted image, and $M_1$ is a conversion matrix with a size of 3×3, where the conversion matrix is an sRGB-to-XYZ matrix, and is used to convert the second image from the sRGB color space to the XYZ color space. For example, a form of $M_1$ may be as follows:
(115) $$M_1 = \begin{bmatrix} 0.4124 & 0.3576 & 0.1805 \\ 0.2126 & 0.7152 & 0.0722 \\ 0.0193 & 0.1192 & 0.9505 \end{bmatrix}$$
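A sketch of the formula (5) conversion, assuming the standard linear-sRGB (D65) matrix for M.sub.1:

```python
import numpy as np

# Formula (5) with the standard linear-sRGB (D65) matrix assumed for M1.
M1 = np.array([[0.4124, 0.3576, 0.1805],
               [0.2126, 0.7152, 0.0722],
               [0.0193, 0.1192, 0.9505]])

def srgb_to_xyz(img):
    """img: ...x3 linear sRGB values; returns XYZ values per formula (5)."""
    return np.asarray(img) @ M1.T
```

Linear sRGB white [1, 1, 1] maps to approximately the D65 white point (0.9505, 1.0, 1.0890), which is a quick sanity check.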
(116) Step S705: The electronic device converts the first converted image from the XYZ color space to the LMS color space to obtain a third image.
(117) Specifically, the LMS color space is a color space represented by a response of three types of cones of human eyes, and is named after their responsivity at a long wavelength, a medium wavelength, and a short wavelength. Chromatic adaptation processing may be performed on an image in the LMS color space, so that a color of a processed image is more consistent with a color really observed by human eyes. For example, the electronic device may convert the first converted image from the XYZ color space to the LMS color space by using a formula (6). The formula (6) is shown below:
(118) $$\begin{bmatrix} L_i \\ M_i \\ S_i \end{bmatrix} = Mcat_1 \cdot \begin{bmatrix} X_i \\ Y_i \\ Z_i \end{bmatrix} \quad (6)$$
where $X_i$, $Y_i$, and $Z_i$ are the XYZ values of the i-th pixel of the first converted image in the XYZ color space, $Mcat_1$ is an XYZ-to-LMS matrix, and is used to convert an image from the XYZ color space to the LMS color space, and $L_i$, $M_i$, and $S_i$ are LMS values of the i-th pixel of the third image in the LMS color space. For example, $Mcat_1$ may be as follows:
(120)
(121) Step S706: The electronic device performs chromatic adaptation processing on the third image in the LMS color space to obtain a fourth image.
(122) Specifically, to make a color of an image captured by the electronic device more consistent with a color of an image really observed by human eyes, the electronic device needs to perform chromatic adaptation processing on the third image, so that the color of the captured image conforms to the theory of incomplete chromatic adaptation of human eyes, and is closer to the color really observed by human eyes. The electronic device may perform chromatic adaptation processing on the third image in the LMS color space by using a chromatic adaptation conversion matrix CA to obtain the fourth image.
(123) The following describes a specific process of calculating the chromatic adaptation conversion matrix by the electronic device with reference to
(124) Step S801: The electronic device calculates a degree D of chromatic adaptation based on an x.sub.1 value and a y.sub.1 value of a white point of a light source in a photographing environment in an xy chromaticity diagram.
(125) Specifically, the degree D of chromatic adaptation is a parameter of a chromatic adaptation model, is used to represent a degree of chromatic adaptation of the chromatic adaptation model under different light source conditions, and is mainly determined by a CCT of an ambient light source and ambient brightness La (unit: candela/square meter). The electronic device may calculate Lv of the photographing environment, and then convert Lv into La. For a calculation method for Lv, refer to related explanations of the foregoing technical term (4). Details are not described in this embodiment of this application again. A value range of D is [0, 1]. When D is 0, it indicates that the chromatic adaptation model is completely chromatically non-adaptive to the ambient light source. To be specific, the chromatic adaptation model is affected by the CCT of the ambient light source and La, and an obtained object color greatly deviates from a real color of an object. When D is 1, it indicates that the chromatic adaptation model is completely chromatically adaptive to the ambient light source. To be specific, the chromatic adaptation model is barely affected by the CCT of the ambient light source or La, and an obtained object color barely deviates from a real color of an object. A larger value of the degree D of chromatic adaptation indicates a higher degree of chromatic adaptation of the chromatic adaptation model.
(126) The chromatic adaptation model is a model designed by researchers by collecting corresponding color data sets and performing fitting through massive experiments to simulate chromatic adaptation of human eyes to an environment, and the corresponding color data sets are obtained through a psychophysical experiment. The psychophysical experiment is intended to enable an observer to find a corresponding color matching a color under two different lighting conditions. For example, the observer is enabled to adjust or select a memory color of a familiar object under a daylight condition and a reference light source (a D65 light source), to obtain a plurality of sets of corresponding color data under various light sources and the D65 light source.
(127) The electronic device may calculate a first variable P.sub.xy by using a formula (7). The formula (7) is shown below:
$$P_{xy} = \sqrt{(x_1 - x_N)^2 + (y_1 - y_N)^2} \quad (7)$$ where
(128) P.sub.xy is the first variable, and is used to represent a distance between the white point of the light source in the photographing environment and a reference white point (in this embodiment of this application, an example in which a reference light source is the D65 light source is used for description), the x.sub.1 value and the y.sub.1 value are respectively an x value and a y value of the white point of the light source in the photographing environment in the xy chromaticity diagram, x.sub.1 is used to identify the x value of the white point of the light source in the environment in the xy chromaticity diagram, y.sub.1 is used to identify the y value of the white point of the light source in the environment in the xy chromaticity diagram, and x.sub.N and y.sub.N are respectively an x value (x.sub.N=0.3127) and a y value (y.sub.N=0.3290) of the reference white point in the xy chromaticity diagram.
(129) A first parameter includes the x.sub.1 value and the y.sub.1 value. The x.sub.1 value and the y.sub.1 value may be obtained by using RGB values of the white point of the light source in the photographing environment. For example, the RGB values of the white point of the light source in the photographing environment may be converted into X.sub.1Y.sub.1Z.sub.1 values of the white point of the light source in the photographing environment in an XYZ color space by using a CCM, and the x.sub.1 value and the y.sub.1 value are obtained based on the X.sub.1Y.sub.1Z.sub.1 values. Calculation formulas related to the x.sub.1 value and the y.sub.1 value may be shown in a formula (8) to a formula (10):
(130) $$\begin{bmatrix} X_1 \\ Y_1 \\ Z_1 \end{bmatrix} = M \cdot \begin{bmatrix} R \\ G \\ B \end{bmatrix} \quad (8); \qquad x_1 = \frac{X_1}{X_1 + Y_1 + Z_1} \quad (9); \qquad y_1 = \frac{Y_1}{X_1 + Y_1 + Z_1} \quad (10)$$
where $M$ is the RGB-to-XYZ conversion obtained by using the CCM.
(131) After calculating the first variable, the electronic device may calculate a second variable D.sub.c by using a formula (11). The formula (11) is shown below:
$$D_c = 1 - e^{-5.3 \cdot P_{xy}} \quad (11)$$
(132) After calculating the second variable D.sub.c, the electronic device may calculate the degree D of chromatic adaptation based on a formula (12). The formula (12) is shown below:
$$D = 0.96 \cdot D_c \cdot \left(\left(1 - e^{-(4.28 \cdot \log La)}\right)^{406.51}\right) + 1 \quad (12)$$ where
(133) e.sup.(4.28.Math.log La) is an exponential function whose base is a constant e, and e is approximately 2.71828.
(134) It should be understood that the foregoing calculation method for the degree D of chromatic adaptation is only an example calculation method for the degree D of chromatic adaptation for description in this embodiment of this application. A calculation method for the degree D of chromatic adaptation is not limited in this embodiment of this application.
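A partial sketch of this step: formulas (7) and (11) are implemented as read above, while the closed form of formula (12) is garbled in the available text, so this sketch simply clamps D.sub.c to [0, 1] as a provisional degree of adaptation (an assumption, not the patented formula (12)):

```python
import math

# Reference white point (D65) in the xy chromaticity diagram.
X_N, Y_N = 0.3127, 0.3290

def adaptation_degree(x1, y1):
    """Provisional degree of chromatic adaptation from the scene white point."""
    p_xy = math.hypot(x1 - X_N, y1 - Y_N)  # formula (7)
    d_c = 1.0 - math.exp(-5.3 * p_xy)      # formula (11), as read here
    # Formula (12) (the La-dependent term) is not reproduced; clamp D_c instead.
    return max(0.0, min(1.0, d_c))
```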
(135) Step S802: The electronic device calculates L.sub.curM.sub.curS.sub.cur values of a target white point in an LMS color space based on the degree D of chromatic adaptation and the x.sub.1 value and the y.sub.1 value of the white point of the light source in the photographing environment in the xy chromaticity diagram.
(136) Specifically, the target white point is a white point calculated by using a chromatic adaptation algorithm, and is a white point, predicted by the electronic device, of the light source, and the L.sub.curM.sub.curS.sub.cur values are the LMS values of the target white point in the LMS color space. A calculation method for the L.sub.curM.sub.curS.sub.cur values of the target white point is as follows:
(137) Because a Y value in the XYZ color space is used to indicate brightness, the electronic device may set a Y.sub.src value to be different constants m according to a color brightness requirement. When m=1, it indicates that the electronic device does not adjust image brightness when calculating a chromatic adaptation conversion relationship. An example in which Y.sub.src=1.0 is used for description in this embodiment of this application. After determining Y.sub.src, the electronic device may calculate a third variable i.sub.y by using a formula (13). The formula (13) is shown below:
(138) $$i_y = \frac{1}{y_1} \quad (13)$$
(139) Then the electronic device calculates, based on a formula (14) and a formula (15), an X.sub.src value and a Z.sub.src value of the white point of the light source in the photographing environment in the XYZ color space when Y.sub.src=1. The formula (14) and the formula (15) are shown below:
$$X_{src} = x_1 \cdot Y_{src} \cdot i_y \quad (14);$$
$$Z_{src} = Y_{src} \cdot (1 - x_1 - y_1) \cdot i_y \quad (15)$$
(140) In this way, the electronic device may calculate XYZ values, namely, the X.sub.srcY.sub.srcZ.sub.src values, of the white point of the light source in the photographing environment in the XYZ color space by using the formula (13) to the formula (15). After calculating the X.sub.srcY.sub.srcZ.sub.src values, the electronic device may obtain LMS values, namely, the L.sub.srcM.sub.srcS.sub.src values, of the white point of the light source in the photographing environment in the LMS color space based on the X.sub.srcY.sub.srcZ.sub.src values. For example, the electronic device may calculate the L.sub.srcM.sub.srcS.sub.src values of the white point of the light source in the photographing environment in the LMS color space by using a formula (16). The formula (16) is shown below:
(141) $$\begin{bmatrix} L_{src} \\ M_{src} \\ S_{src} \end{bmatrix} = Mcat_1 \cdot \begin{bmatrix} X_{src} \\ Y_{src} \\ Z_{src} \end{bmatrix} \quad (16)$$
(142) Likewise, the electronic device may convert XYZ values of the reference white point (in this embodiment of this application, an example in which a reference light source is the D65 light source is used for description) (under the D65 light source, the XYZ values of the reference white point are 0.95047, 1, and 1.08883 respectively) in the XYZ color space into L.sub.dstM.sub.dstS.sub.dst values in the LMS color space based on the formula (16).
(143) After calculating the L.sub.dstM.sub.dstS.sub.dst values of the reference white point and the L.sub.srcM.sub.srcS.sub.src values of the white point of the light source in the photographing environment, the electronic device may calculate a gain matrix M.sub.Gain1 of the reference white point and the white point of the light source in the photographing environment in the LMS color space. The electronic device may calculate M.sub.Gain1 by using a formula (17). The formula (17) is shown below:
(144) $$M_{Gain1} = \begin{bmatrix} L_{dst}/L_{src} & 0 & 0 \\ 0 & M_{dst}/M_{src} & 0 \\ 0 & 0 & S_{dst}/S_{src} \end{bmatrix} \quad (17)$$
(145) After calculating the gain matrix M.sub.Gain1, the electronic device may calculate the LMS values, namely, the L.sub.curM.sub.curS.sub.cur values, of the target white point in the LMS color space based on M.sub.Gain1 and the degree D of chromatic adaptation. The electronic device may calculate the L.sub.curM.sub.curS.sub.cur values by using a formula (18), and calculate XYZ values, namely, X.sub.curY.sub.curZ.sub.cur values, of the target white point in the XYZ color space by using a formula (19). The formula (18) and the formula (19) are shown below:
(146) $$\begin{bmatrix} L_{cur} \\ M_{cur} \\ S_{cur} \end{bmatrix} = M_{GainD} \cdot \begin{bmatrix} L_{src} \\ M_{src} \\ S_{src} \end{bmatrix} \quad (18); \qquad \begin{bmatrix} X_{cur} \\ Y_{cur} \\ Z_{cur} \end{bmatrix} = Mcat_1^{-1} \cdot \begin{bmatrix} L_{cur} \\ M_{cur} \\ S_{cur} \end{bmatrix} \quad (19)$$
where $M_{GainD} = D \cdot M_{Gain1} + (1 - D) \cdot I$ is a gain matrix of the white point of the light source in the photographing environment and the reference white point in the LMS color space when the degree of chromatic adaptation is D.
(147) Step S803: The electronic device calculates a chromatic adaptation conversion matrix CA based on the L.sub.curM.sub.curS.sub.cur values of the target white point and the L.sub.dstM.sub.dstS.sub.dst values of the reference white point.
(148) Specifically, the L.sub.dstM.sub.dstS.sub.dst values are LMS values of the reference white point in the LMS color space. The electronic device may calculate the chromatic adaptation conversion matrix CA based on a formula (20). The chromatic adaptation conversion matrix CA is a second parameter. The formula (20) is shown below:
(149) $$CA = \begin{bmatrix} L_{dst}/L_{cur} & 0 & 0 \\ 0 & M_{dst}/M_{cur} & 0 \\ 0 & 0 & S_{dst}/S_{cur} \end{bmatrix} \quad (20)$$
(150) The foregoing step S801 to step S803 describe an example of a specific process of calculating a CA matrix by an electronic device. It should be understood that step S801 to step S803 are performed after the electronic device calculates the chromaticity information of the white point of the light source in the photographing environment by using the white balance algorithm (after the electronic device performs step S702) and before the electronic device performs chromatic adaptation processing on the third image. It can be learned from the foregoing calculation process that the chromatic adaptation conversion matrix CA is a parameter related to the LMS color space, the chromatic adaptation conversion matrix CA has a correspondence with the x.sub.1 value and the y.sub.1 value, and the chromatic adaptation conversion matrix CA can identify light source information of an ambient light source in the LMS color space during photographing. Because the LMS color space has a conversion relationship with the XYZ color space, the chromatic adaptation conversion matrix CA is also a parameter related to the XYZ color space, and can identify light source information of an ambient light source in the XYZ color space during photographing.
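A hedged sketch of the step S801 to step S803 computation, assuming a von Kries-style diagonal form for the gain matrices and for the CA matrix (the patent's exact matrix forms are not reproduced in the available text):

```python
import numpy as np

# Assumed diagonal (von Kries-style) gains; lms_src / lms_dst are the LMS
# values of the scene white point and of the reference (D65) white point.
def ca_matrix(lms_src, lms_dst, d):
    lms_src = np.asarray(lms_src, dtype=float)
    lms_dst = np.asarray(lms_dst, dtype=float)
    gain1 = lms_dst / lms_src          # diagonal of the reference/source gain
    gain_d = d * gain1 + (1.0 - d)     # incomplete adaptation with degree D
    lms_cur = gain_d * lms_src         # target white point in LMS
    return np.diag(lms_dst / lms_cur)  # CA maps the target white point to dst
```

With D = 1 (complete adaptation) the target white point coincides with the reference white point and CA reduces to the identity; with D = 0 it applies the full von Kries correction.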
(151) After the electronic device calculates the CA matrix, the electronic device may adjust LMS values of the third image by using the CA matrix to obtain the chromatically adapted fourth image. For example, the electronic device may adjust the LMS values of the third image in the LMS color space by using a formula (21) to obtain the fourth image. The formula (21) is shown below:
(152) $$\begin{bmatrix} L'_i \\ M'_i \\ S'_i \end{bmatrix} = CA \cdot \begin{bmatrix} L_i \\ M_i \\ S_i \end{bmatrix} \quad (21)$$
where $L_i$, $M_i$, and $S_i$ are LMS values of the i-th pixel of the third image in the LMS color space, and $L'_i$, $M'_i$, and $S'_i$ are LMS values of the i-th pixel of the fourth image in the LMS color space.
(154) Step S707: The electronic device converts the fourth image from the LMS color space to the XYZ color space to obtain a second converted image.
(155) For example, the electronic device may convert the fourth image from the LMS color space to the XYZ color space by using a formula (22) to obtain the second converted image. The formula (22) is shown below:
(156) $$\begin{bmatrix} X_i \\ Y_i \\ Z_i \end{bmatrix} = Mcat_2 \cdot \begin{bmatrix} L'_i \\ M'_i \\ S'_i \end{bmatrix} \quad (22)$$
where $X_i$, $Y_i$, and $Z_i$ are XYZ values of the i-th pixel of the second converted image in the XYZ color space, and $Mcat_2$ is an LMS-to-XYZ matrix, and is used to convert an image from the LMS color space to the XYZ color space. For example, $Mcat_2$ may be as follows:
(157)
(158) Step S708: The electronic device converts the second converted image from the XYZ color space to the sRGB color space to obtain a third converted image.
(159) Specifically, the second converted image is an image in the XYZ color space under the target white point. The electronic device needs to convert the second converted image from the XYZ color space to the sRGB color space to obtain the third converted image, so that the electronic device can subsequently perform gamma correction processing, tone mapping, or other image processing steps on the third converted image. For example, the electronic device may convert the second converted image from the XYZ color space to the sRGB color space by using a formula (23). The formula (23) is shown below:
(160) $$\begin{bmatrix} R_i \\ G_i \\ B_i \end{bmatrix} = M_2 \cdot \begin{bmatrix} X_i \\ Y_i \\ Z_i \end{bmatrix} \quad (23)$$
where $X_i$, $Y_i$, and $Z_i$ are the XYZ values of the i-th pixel of the second converted image in the XYZ color space, $R_i$, $G_i$, and $B_i$ are RGB values of the i-th pixel of the third converted image in the sRGB color space, and $M_2$ is a conversion matrix for converting an image from the XYZ color space to the sRGB color space, and $M_2$ may be as follows:
(162) $$M_2 = \begin{bmatrix} 3.2406 & -1.5372 & -0.4986 \\ -0.9689 & 1.8758 & 0.0415 \\ 0.0557 & -0.2040 & 1.0570 \end{bmatrix}$$
(163) Step S709: The electronic device performs RGB color space image processing on the third converted image to obtain a fourth converted image.
(164) Specifically, the RGB color space image processing includes performing DRC (dynamic range compression) processing and/or gamma (GAMMA) processing on the third converted image.
(165) Step S710: The electronic device converts the fourth converted image from the sRGB color space to a YUV color space to obtain a fifth converted image.
(166) Step S711: The electronic device performs YUV color space processing on the fifth converted image to obtain a target image.
(167) Specifically, the YUV color space image processing includes performing tone mapping processing (Tone mapping processing) and/or 3D LUT processing on the image.
(168) In a possible implementation, after performing YUV color space processing on the fifth converted image, the electronic device may convert a processed image into a JPEG format to obtain the target image.
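The sRGB-to-YUV conversion of step S710 can be sketched as follows; the BT.601 full-range matrix used here is one common convention and is an assumption, since the text does not specify which YUV matrix is used:

```python
import numpy as np

# BT.601 full-range RGB -> YUV matrix (one common convention; assumed here).
M_YUV = np.array([[ 0.299,    0.587,    0.114  ],
                  [-0.14713, -0.28886,  0.436  ],
                  [ 0.615,   -0.51499, -0.10001]])

def rgb_to_yuv(img):
    """img: ...x3 RGB values; returns YUV values per pixel."""
    return np.asarray(img) @ M_YUV.T
```

White [1, 1, 1] maps to Y = 1 with U and V near 0, which is a quick sanity check of the matrix rows.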
(169) The foregoing step S701 to step S711 describe a specific process of performing, by the electronic device, chromatic adaptation processing on an image.
(170) In a possible implementation, the electronic device calculates the chromatic adaptation conversion matrix CA by querying a correlated color temperature shift table (CCT Shift Table) and a D.sub.uv shift table (D.sub.uv Shift Table); performs chromatic adaptation on the third image based on the CA to obtain the fourth image; converts the fourth image from the LMS color space to the sRGB color space to obtain the third converted image; and then performs a series of image processing on the third converted image to obtain the target image.
(171) The electronic device may find a correlated color temperature CCT.sub.cur of the target white point and D.sub.uv of the target white point, namely, D.sub.uv_cur, in a preset CCT shift table (CCT Shift Table) and D.sub.uv shift table (D.sub.uv Shift Table) by using a correlated color temperature CCT.sub.src of the light source in the photographing environment and D.sub.uv of the white point of the light source in the photographing environment, namely, D.sub.uv_src. Then the electronic device calculates the L.sub.curM.sub.curS.sub.cur values of the target white point by using a conversion relationship between a CCT, D.sub.uv, and XYZ values, to calculate the CA matrix; adjusts the third image based on the CA matrix; and then obtains the third converted image in the sRGB color space based on the third image. The CCT shift table and the D.sub.uv shift table are shown in
(172) The electronic device may build a three-dimensional correlated color temperature shift table (CCT Shift Table) and D.sub.uv shift table (D.sub.uv Shift Table) based on the foregoing formula (7) to formula (19) and related conversion formulas of a CCT, D.sub.uv, and XYZ values; and may obtain CCT.sub.cur and D.sub.uv_cur from the CCT shift table and the D.sub.uv shift table by using Lv, CCT.sub.src, and D.sub.uv_src of the light source in the photographing environment, and then obtain the x.sub.1 value and the y.sub.1 value by using a formula for conversion from CCT.sub.cur and D.sub.uv_cur to the x.sub.1 value and the y.sub.1 value, so as to calculate the X.sub.curY.sub.curZ.sub.cur values of the target white point in the XYZ color space based on the x.sub.1 value and the y.sub.1 value.
(173) The electronic device may calculate D.sub.uv_src of the white point of the light source in the photographing environment based on the x.sub.1 value and the y.sub.1 value. It should be understood that there are a plurality of calculation methods for D.sub.uv_src. This embodiment of this application describes only one of the calculation methods for D.sub.uv_src as an example. A specific method used by the electronic device to calculate D.sub.uv_src is not limited in this embodiment of this application. A specific method for calculating D.sub.uv_src by the electronic device is as follows:
(174) The electronic device may calculate a u.sub.1 value and a v.sub.1 value of the white point of the light source in the photographing environment in a uv chromaticity diagram based on a formula (24) and a formula (25) respectively. The formula (24) and the formula (25) are shown below:
u.sub.1=(4x.sub.1)/(-2x.sub.1+12y.sub.1+3) (24)
v.sub.1=(6y.sub.1)/(-2x.sub.1+12y.sub.1+3) (25)
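As an illustrative sketch (the function name is an assumption, not part of the method), formulas (24) and (25) map CIE xy chromaticity to the CIE 1960 uv chromaticity diagram:

```python
def xy_to_uv(x, y):
    """Convert CIE xy chromaticity to CIE 1960 uv (formulas (24)-(25))."""
    d = -2.0 * x + 12.0 * y + 3.0
    return 4.0 * x / d, 6.0 * y / d
```

For example, the D65 white point (x=0.3127, y=0.3290) maps to approximately (u=0.1978, v=0.3122).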
(175) Then the electronic device may calculate a third parameter L.sub.Fp, a fourth parameter a.sub.1, and a fifth parameter L.sub.BB based on a formula (26) to a formula (28). The formula (26) to the formula (28) are shown below:
L.sub.Fp={square root over ((u.sub.1-0.292).sup.2+(v.sub.1-0.24).sup.2)} (26)
a.sub.1=arctan((v.sub.1-0.24)/(u.sub.1-0.292)) (27)
L.sub.BB=k.sub.06a.sup.6+k.sub.05a.sup.5+k.sub.04a.sup.4+k.sub.03a.sup.3+k.sub.02a.sup.2+k.sub.01a+k.sub.00(28), where
(176) k.sub.06, k.sub.05, k.sub.04, k.sub.03, k.sub.02, k.sub.01, and k.sub.00 are constants; and when a.sub.1 is greater than or equal to 0, a=a.sub.1; or when a.sub.1 is less than 0, a=a.sub.1+π.
(177) Finally, the electronic device may calculate D.sub.uv_src by using a formula (29). The formula (29) is shown below:
D.sub.uv_src=L.sub.Fp-L.sub.BB (29)
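The D.sub.uv_src calculation in formulas (26) to (29) can be sketched as follows. The sixth-order coefficients k.sub.00 to k.sub.06 are tuning constants not reproduced in this embodiment, so the sketch takes them as a caller-supplied list; the function name is illustrative:

```python
import math

def duv_from_uv(u1, v1, k):
    """Sketch of formulas (26)-(29): signed distance of the white point
    (u1, v1) from the Planckian locus.  `k` is the 7-element coefficient
    list [k00, k01, ..., k06] of the sixth-order fit in formula (28)."""
    l_fp = math.hypot(u1 - 0.292, v1 - 0.24)        # formula (26)
    a1 = math.atan((v1 - 0.24) / (u1 - 0.292))      # formula (27)
    a = a1 if a1 >= 0 else a1 + math.pi             # fold angle into [0, pi)
    l_bb = sum(k[i] * a ** i for i in range(7))     # formula (28)
    return l_fp - l_bb                              # formula (29)
```

With all-zero coefficients the result degenerates to the plain distance L.sub.Fp, which gives a quick sanity check.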
(178) After calculating D.sub.uv_src, the electronic device may calculate the correlated color temperature CCT.sub.src of the white point of the light source in the photographing environment based on a formula (30). The formula (30) is shown below:
CCT.sub.src=T.sub.2-T.sub.c2 (30), where
T.sub.2=T.sub.1-T.sub.c1; when a is less than 2.54:
T.sub.1=1/(k.sub.16a.sup.6+k.sub.15a.sup.5+k.sub.14a.sup.4+k.sub.13a.sup.3+k.sub.12a.sup.2+k.sub.11a+k.sub.10);
T.sub.c1=(k.sub.36a.sup.6+k.sub.35a.sup.5+k.sub.34a.sup.4+k.sub.33a.sup.3+k.sub.32a.sup.2+k.sub.31a+k.sub.30).Math.(L.sub.BB+0.01)/L.sub.Fp.Math.D.sub.uv_src/0.01; or
(179) when a is greater than or equal to 2.54:
T.sub.1=1/(k.sub.26a.sup.6+k.sub.25a.sup.5+k.sub.24a.sup.4+k.sub.23a.sup.3+k.sub.22a.sup.2+k.sub.21a+k.sub.20);
T.sub.c1=(k.sub.46a.sup.6+k.sub.45a.sup.5+k.sub.44a.sup.4+k.sub.43a.sup.3+k.sub.42a.sup.2+k.sub.41a+k.sub.40).Math.(L.sub.BB+0.01)/L.sub.Fp.Math.D.sub.uv_src/0.01; and
(180) when D.sub.uv_src is greater than or equal to 0:
T.sub.c2=(k.sub.56c.sup.6+k.sub.55c.sup.5+k.sub.54c.sup.4+k.sub.53c.sup.3+k.sub.52c.sup.2+k.sub.51c+k.sub.50); or
(181) when D.sub.uv_src is less than 0:
T.sub.c2=(k.sub.66c.sup.6+k.sub.65c.sup.5+k.sub.64c.sup.4+k.sub.63c.sup.3+k.sub.62c.sup.2+k.sub.61c+k.sub.60).Math.|D.sub.uv_src/0.03|.sup.2, where
(182) c=log(T.sub.2), and k.sub.10 to k.sub.16, k.sub.20 to k.sub.26, k.sub.30 to k.sub.36, k.sub.40 to k.sub.46, k.sub.50 to k.sub.56, and k.sub.60 to k.sub.66 are constants.
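The piecewise CCT.sub.src estimate in formula (30) can be sketched as a single function. The polynomial coefficients are tuning constants not given in this embodiment, so the sketch takes the six polynomials as caller-supplied callables; base-10 log for c and the use of D.sub.uv of the source white point in the correction term are assumptions:

```python
import math

def cct_from_triangulation(a, l_fp, l_bb, duv_src, polys):
    """Sketch of formula (30).  `polys` maps 't1_lo', 'tc1_lo', 't1_hi',
    'tc1_hi', 'tc2_pos', 'tc2_neg' to callables that evaluate the
    sixth-order polynomials k1*, k3*, k2*, k4*, k5*, k6* respectively."""
    if a < 2.54:
        t1 = 1.0 / polys['t1_lo'](a)
        tc1 = polys['tc1_lo'](a) * (l_bb + 0.01) / l_fp * duv_src / 0.01
    else:
        t1 = 1.0 / polys['t1_hi'](a)
        tc1 = polys['tc1_hi'](a) * (l_bb + 0.01) / l_fp * duv_src / 0.01
    t2 = t1 - tc1                       # T2 = T1 - Tc1
    c = math.log10(t2)                  # c = log(T2), base 10 assumed
    if duv_src >= 0:
        tc2 = polys['tc2_pos'](c)
    else:
        tc2 = polys['tc2_neg'](c) * abs(duv_src / 0.03) ** 2
    return t2 - tc2                     # CCT_src = T2 - Tc2
```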
(183) It should be understood that there are a plurality of calculation methods for CCT.sub.src. This embodiment of this application describes only one of the calculation methods as an example. The calculation method for CCT.sub.src is not limited in this embodiment of this application.
(184) Likewise, the electronic device may calculate D.sub.uv_cur of the target white point by using the D.sub.uv shift table. Then the electronic device may calculate the X.sub.curY.sub.curZ.sub.cur values of the target white point in the XYZ color space by using CCT.sub.cur and D.sub.uv_cur.
(185) The following describes an example of calculating, by the electronic device, the X.sub.curY.sub.curZ.sub.cur values of the target white point in the XYZ color space by using CCT.sub.cur and D.sub.uv_cur.
(186) The electronic device may obtain a black body radiation curve L(λ) whose color temperature is CCT.sub.cur based on a formula (31). The formula (31) is shown below:
(187) L(λ)=c.sub.1.Math.λ.sup.-5/(e.sup.(c.sub.2/(λ.Math.CCT.sub.cur))-1) (31), where
(188) c.sub.1 is a first constant, c.sub.2 is a second constant, and λ is a wavelength. Then the electronic device calculates an x.sub.2 value and a y.sub.2 value of black body radiation light when the color temperature is CCT.sub.cur. A specific process is as follows: The electronic device may calculate XYZ values of the black body radiation light in the XYZ color space based on a formula (32) to a formula (34). The formula (32) to the formula (34) are shown below:
(189) X.sub.2=k.Math.∫L(λ).Math.x(λ)dλ (32); Y.sub.2=k.Math.∫L(λ).Math.y(λ)dλ (33); Z.sub.2=k.Math.∫L(λ).Math.z(λ)dλ (34), where
(190) X.sub.2 is an X value of the black body radiation light in the XYZ color space, Y.sub.2 is a Y value of the black body radiation light in the XYZ color space, Z.sub.2 is a Z value of the black body radiation light in the XYZ color space, k is a constant, and x(λ), y(λ), and z(λ) are the CIE standard observer color matching functions.
(191) Then the electronic device may calculate an x.sub.2 value and a y.sub.2 value of the black body radiation light in the xy chromaticity diagram based on X.sub.2Y.sub.2Z.sub.2. For calculation formulas for the x.sub.2 value and the y.sub.2 value, refer to the foregoing formula (9) and formula (10). Details are not described herein again in this embodiment of this application.
(192) Then the electronic device converts the x.sub.2 value and the y.sub.2 value into a u.sub.2 value and a v.sub.2 value in the uv chromaticity diagram. For calculation formulas for the u.sub.2 value and the v.sub.2 value, refer to the foregoing formula (24) and formula (25). Details are not described herein again in this embodiment of this application.
(193) Likewise, the electronic device may calculate, based on the foregoing formula (31), a black body radiation curve L.sub.1(λ) whose color temperature is CCT.sub.cur+Delta_T (in this embodiment of this application, an example in which Delta_T is 0.01 is used for description), to calculate a u.sub.3 value and a v.sub.3 value of the black body radiation light whose color temperature is CCT.sub.cur+Delta_T in the CIE uv chromaticity diagram.
(194) Then the electronic device may calculate a u.sub.cur value and a v.sub.cur value of the target white point in the CIE uv chromaticity diagram based on a formula (35) and a formula (36). The formula (35) and the formula (36) are shown below:
(195) u.sub.cur=u.sub.2+D.sub.uv_cur.Math.(v.sub.3-v.sub.2)/L.sub.uv (35); v.sub.cur=v.sub.2-D.sub.uv_cur.Math.(u.sub.3-u.sub.2)/L.sub.uv (36), where L.sub.uv={square root over ((u.sub.3-u.sub.2).sup.2+(v.sub.3-v.sub.2).sup.2)}
(196) Then the electronic device may calculate an x.sub.cur value and a y.sub.cur value of the target white point in the CIE xy chromaticity diagram based on a formula (37) and a formula (38). The formula (37) and the formula (38) are shown below:
x.sub.cur=9u.sub.cur/(6u.sub.cur-24v.sub.cur+12) (37);
y.sub.cur=3v.sub.cur/(3u.sub.cur-12v.sub.cur+6) (38)
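Assuming two precomputed Planckian-locus points (u.sub.2, v.sub.2) at CCT.sub.cur and (u.sub.3, v.sub.3) at CCT.sub.cur+Delta_T, formulas (35) to (38) can be sketched as follows; the normal-offset construction in the first function is one standard reading of the D.sub.uv definition, and the function names are illustrative:

```python
import math

def uv_from_cct_duv(u2, v2, u3, v3, duv_cur):
    """Sketch of formulas (35)-(36): offset the Planckian-locus point
    (u2, v2) by duv_cur along the normal to the locus, where (u3, v3) is
    the locus point at CCT_cur + Delta_T."""
    du, dv = u3 - u2, v3 - v2
    norm = math.hypot(du, dv)
    return u2 + duv_cur * dv / norm, v2 - duv_cur * du / norm

def uv_to_xy(u, v):
    """Formulas (37)-(38): CIE 1960 uv back to CIE xy chromaticity."""
    d = 6.0 * u - 24.0 * v + 12.0
    return 9.0 * u / d, 6.0 * v / d
```

When D.sub.uv_cur is 0, the target white point coincides with the locus point (u.sub.2, v.sub.2), and uv_to_xy inverts the xy-to-uv conversion of formulas (24) and (25).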
(197) The electronic device may calculate X.sub.curY.sub.curZ.sub.cur of the target white point based on the x.sub.cur value and the y.sub.cur value. For a process and a method of calculating X.sub.curY.sub.curZ.sub.cur by the electronic device based on the x.sub.cur value and the y.sub.cur value, refer to related descriptions of calculating, by the electronic device, the X.sub.srcY.sub.srcZ.sub.src values of the white point of the light source in the photographing environment in the XYZ color space based on the x.sub.1 value and the y.sub.1 value of the white point of the light source in the photographing environment in the CIE xy chromaticity diagram in step S802. Details are not described herein again in this embodiment of this application.
(198) After calculating X.sub.curY.sub.curZ.sub.cur, the electronic device may convert X.sub.curY.sub.curZ.sub.cur into L.sub.curM.sub.curS.sub.cur; for a method of converting X.sub.curY.sub.curZ.sub.cur into L.sub.curM.sub.curS.sub.cur, refer to the foregoing formula (16). Details are not described herein again in this embodiment of this application. After calculating L.sub.curM.sub.curS.sub.cur, the electronic device may calculate the CA matrix according to the method described in step S803, and then adjust the third image by using the CA matrix based on the foregoing formula (21) to obtain the fourth image.
(199) Finally, the electronic device performs step S707 to step S711 to obtain the target image.
(200) In a possible implementation, the electronic device may perform chromatic adaptation processing on the third image in the LMS color space by using a nonlinear method. The following describes an example of performing, by the electronic device, chromatic adaptation processing on the third image in the LMS color space by using a nonlinear method. The electronic device may calculate the L.sub.curM.sub.curS.sub.cur values of the target white point in the LMS color space by using the following formulas:
L.sub.cur=[D(L.sub.dst/L.sub.src)+1-D].Math.L.sub.src;
M.sub.cur=[D(M.sub.dst/M.sub.src)+1-D].Math.M.sub.src;
S.sub.cur=[D(S.sub.dst/S.sub.src.sup.γ)+1-D].Math.S.sub.src.sup.γ, where
(201) γ=(S.sub.dst/S.sub.src).sup.0.0834. Then the electronic device may adjust the LMS values of the third image in the LMS color space by using the following formulas to obtain the fourth image:
L.sub.4i=(L.sub.cur/L.sub.dst).Math.L.sub.3i;
M.sub.4i=(M.sub.cur/M.sub.dst).Math.M.sub.3i;
S.sub.4i=(S.sub.cur/S.sub.dst.sup.γ1).Math.S.sub.3i.sup.γ1, where
γ1=(S.sub.cur/S.sub.dst).sup.0.0834.
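The nonlinear adaptation above can be sketched as a single function; the function name and the tuple-based interface are illustrative:

```python
def adapt_nonlinear(lms_pixel, lms_src, lms_dst, d):
    """Sketch of the nonlinear chromatic adaptation: a von Kries-style
    scaling on L and M plus an exponent on S.  lms_src is the white point
    of the shooting light source, lms_dst the reference white point, and
    d the degree of chromatic adaptation in [0, 1]."""
    l_s, m_s, s_s = lms_src
    l_d, m_d, s_d = lms_dst
    gamma = (s_d / s_s) ** 0.0834
    # target white point L_cur M_cur S_cur
    l_cur = (d * (l_d / l_s) + 1.0 - d) * l_s
    m_cur = (d * (m_d / m_s) + 1.0 - d) * m_s
    s_cur = (d * (s_d / s_s ** gamma) + 1.0 - d) * s_s ** gamma
    gamma1 = (s_cur / s_d) ** 0.0834
    # adjust the pixel of the third image to obtain the fourth image
    l3, m3, s3 = lms_pixel
    return ((l_cur / l_d) * l3,
            (m_cur / m_d) * m3,
            (s_cur / s_d ** gamma1) * s3 ** gamma1)
```

With d=1 the target white point equals the reference white point and the pixel passes through unchanged; with d=0 the pixel is fully rescaled toward the source white point.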
(202) Then the electronic device performs step S707 to step S711 to obtain the target image.
(203) In this embodiment of this application, the electronic device calculates the chromaticity information of the light source in the photographing environment by using the white balance algorithm; performs white balance processing and color restoration on an image to obtain a processed image, where the processed image is an image under the reference light source; and converts the image to the LMS color space. Then the electronic device calculates the chromatic adaptation conversion matrix based on the chromaticity information of the light source in the photographing environment and the chromatic adaptation algorithm, and adjusts the image in the LMS color space by using the chromatic adaptation conversion matrix, to obtain a chromatically adapted image close to the visual effect of human eyes regardless of whether the degree D of chromatic adaptation is 0, 1, or any value between 0 and 1.
(204) The foregoing embodiment of
(205) Step S1101A: The electronic device predicts a light source by using a RAW image output by a camera, and calculates R.sub.g and B.sub.g of a white point of a light source in a photographing environment and RGB values of the white point of the light source in the photographing environment.
(206) Step S1102A: The electronic device calculates an x.sub.1 value and a y.sub.1 value of the white point of the light source in the environment based on the RGB values of the white point of the light source in the photographing environment, and then calculates CCT.sub.src and D.sub.uv_src of the light source in the photographing environment based on the x.sub.1 value and the y.sub.1 value.
(207) Step S1103A: The electronic device performs CCM interpolation based on the CCT of the light source in the photographing environment to obtain a color correction matrix CCM.
(208) Step S1104A: The electronic device performs white balance processing on the RAW image based on 1/R.sub.g and 1/B.sub.g to obtain a RAW image obtained through white balance processing.
(209) Step S1105A: The electronic device performs, by using the CCM, color restoration processing on the RAW image obtained through white balance processing to obtain a second image in an sRGB color space.
(210) For step S1101A to step S1105A, refer to related descriptions of step S601 to step S605. Details are not described again in this embodiment of this application.
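Steps S1104A and S1105A can be sketched as follows; the gain convention (scaling R by 1/R.sub.g and B by 1/B.sub.g with G fixed) follows the description above, and the function name is illustrative:

```python
import numpy as np

def white_balance_and_ccm(raw_rgb, r_g, b_g, ccm):
    """Sketch of steps S1104A-S1105A: apply white balance gains 1/R_g and
    1/B_g to the RAW RGB values, then apply the interpolated color
    correction matrix CCM to obtain values in the sRGB color space."""
    gains = np.array([1.0 / r_g, 1.0, 1.0 / b_g])
    return np.asarray(ccm, dtype=float) @ (np.asarray(raw_rgb, dtype=float) * gains)
```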
(211) Step S1106A: The electronic device adjusts the second image by using a target matrix to obtain a third converted image in the sRGB color space.
(212) Specifically, the electronic device may adjust the second image by using a formula (39) to obtain the third converted image. The formula (39) is shown below:
(213) (R.sub.i′, G.sub.i′, B.sub.i′).sup.T=CA.sub.1.Math.(R.sub.i, G.sub.i, B.sub.i).sup.T (39), where
(214) R.sub.i, G.sub.i, and B.sub.i are RGB values of the i-th pixel of the second image, R.sub.i′, G.sub.i′, and B.sub.i′ are RGB values of the i-th pixel of the third converted image in the sRGB color space, and CA.sub.1 is the target matrix.
(215) For example, CA.sub.1=M.sub.2.Math.Mcat.sub.2.Math.CA.Math.Mcat.sub.1.Math.M.sub.1, where M.sub.1 is a conversion matrix for conversion from the sRGB color space to an XYZ color space, M.sub.2 is a conversion matrix for conversion from the XYZ color space to the sRGB color space, Mcat.sub.1 is a conversion matrix for conversion from the XYZ color space to an LMS color space, Mcat.sub.2 is a conversion matrix for conversion from the LMS color space to the XYZ color space, CA is a chromatic adaptation conversion matrix, and CA may be obtained by using the foregoing formula (7) to formula (20). Details are not described herein again in this embodiment of this application.
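Folding the five matrices into the single target matrix CA.sub.1 can be sketched as follows. The sRGB/XYZ matrix is the standard D65 one; the Bradford matrix is used for Mcat.sub.1 purely as an assumption for illustration, since this embodiment does not fix a particular XYZ-to-LMS transform:

```python
import numpy as np

M1 = np.array([[0.4124, 0.3576, 0.1805],       # sRGB -> XYZ (D65)
               [0.2126, 0.7152, 0.0722],
               [0.0193, 0.1192, 0.9505]])
M2 = np.linalg.inv(M1)                          # XYZ -> sRGB
MCAT1 = np.array([[ 0.8951,  0.2664, -0.1614],  # XYZ -> LMS (Bradford, assumed)
                  [-0.7502,  1.7135,  0.0367],
                  [ 0.0389, -0.0685,  1.0296]])
MCAT2 = np.linalg.inv(MCAT1)                    # LMS -> XYZ

def build_ca1(ca):
    """Fold the chromatic adaptation matrix CA into one sRGB->sRGB matrix:
    CA1 = M2 . Mcat2 . CA . Mcat1 . M1."""
    return M2 @ MCAT2 @ ca @ MCAT1 @ M1

def apply_ca1(rgb, ca1):
    """Formula (39): per-pixel multiplication in the sRGB color space."""
    return ca1 @ np.asarray(rgb, dtype=float)
```

Because the surrounding conversions cancel, an identity CA yields an identity CA.sub.1, which is a convenient sanity check for the folding order.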
(216) Step S1107A: The electronic device performs RGB color space image processing on the third converted image to obtain a fourth converted image. Specifically, the RGB color space image processing includes performing DRC (dynamic range compression) processing and/or gamma (GAMMA) processing on the third converted image.
(217) Step S1108A: The electronic device converts the fourth converted image from the sRGB color space to a YUV color space to obtain a fifth converted image.
(218) Step S1109A: The electronic device performs YUV color space processing on the fifth converted image to obtain a target image.
(219) In a possible implementation, after performing YUV color space processing on the fifth converted image, the electronic device may convert a processed image into a JPEG format to obtain the target image.
(220) For step S1107A to step S1109A, refer to related descriptions of step S709 to step S711. Details are not described again in this embodiment of this application.
(221) The following describes in detail the steps in the block diagram of
(222) Step S1101B: An electronic device starts a camera application in response to a first operation.
(223) Step S1102B: The electronic device calculates chromaticity information of a white point of a light source in a photographing environment by using a white balance algorithm.
(224) Step S1103B: The electronic device performs white balance processing and color restoration processing on the RAW image output by the camera to obtain a second image.
(225) For step S1101B to step S1103B, refer to step S701 to step S703. Details are not described herein again in this embodiment of this application.
(226) Step S1104B: The electronic device adjusts the second image by using a target matrix to obtain a third converted image in an sRGB color space.
(227) For step S1104B, refer to related descriptions of step S1106A. Details are not described herein again in this embodiment of this application.
(228) Step S1105B: The electronic device performs RGB color space image processing on the third converted image to obtain a fourth converted image.
(229) Step S1106B: The electronic device converts the fourth converted image from the sRGB color space to a YUV color space to obtain a fifth converted image.
(230) Step S1107B: The electronic device performs YUV color space processing on the fifth converted image to obtain a target image.
(231) In a possible implementation, after performing YUV color space processing on the fifth converted image, the electronic device may convert a processed image into a JPEG format to obtain the target image.
(232) For step S1105B to step S1107B, refer to step S709 to step S711. Details are not described herein again in this embodiment of this application.
(233) In this embodiment of this application, the second image is multiplied by the target matrix to obtain the third converted image that is in the sRGB color space and that is obtained through chromatic adaptation processing, and a series of image processing is performed on the third converted image to obtain a chromatically adapted image close to a visual effect of human eyes, thereby reducing a quantity of times of calculation by the electronic device, and saving computing resources of the electronic device.
(234) The following describes, with reference to
(235) Step S1201: An electronic device starts a camera application in response to a first operation.
(236) Step S1202: The electronic device calculates chromaticity information of a white point of a light source in a photographing environment by using a white balance algorithm.
(237) Step S1203: The electronic device performs white balance processing and color restoration processing on the RAW image output by the camera to obtain a second image.
(238) For related descriptions of step S1201 to step S1203, refer to related descriptions of step S701 to step S703. Details are not described again in this embodiment of this application.
(239) Step S1204: The electronic device converts a first target image from an sRGB color space to an XYZ color space in response to a second operation to obtain a first converted image.
(240) Specifically, the second operation may be the tap operation on the chromatic adaptation processing control 404 in
(241) Step S1205: The electronic device converts the first converted image from the XYZ color space to an LMS color space to obtain a third image.
(242) Step S1206: The electronic device performs chromatic adaptation processing on the third image in the LMS color space to obtain a fourth image.
(243) Step S1207: The electronic device converts the fourth image from the LMS color space to the XYZ color space to obtain a second converted image.
(244) Step S1208: The electronic device converts the second converted image from the XYZ color space to the sRGB color space to obtain a third converted image.
(245) Step S1209: The electronic device performs RGB color space image processing on the third converted image to obtain a fourth converted image.
(246) Step S1210: The electronic device converts the fourth converted image from the sRGB color space to a YUV color space to obtain a fifth converted image.
(247) Step S1211: The electronic device performs YUV color space processing on the fifth converted image to obtain a target image.
(248) In a possible implementation, after performing YUV color space processing on the fifth converted image, the electronic device may convert a processed image into a JPEG format to obtain the target image.
(249) For related descriptions of step S1204 to step S1211, refer to related descriptions of step S704 to step S711. Details are not described again in this embodiment of this application.
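As one plausible instance of the sRGB-to-YUV conversion in step S1210 (this embodiment does not fix a particular YUV matrix, so the BT.601 full-range coefficients below are an assumption for illustration):

```python
import numpy as np

# BT.601 full-range RGB -> YUV matrix (assumed; the method only requires
# some conversion from the sRGB color space to a YUV color space).
RGB2YUV = np.array([[ 0.299,    0.587,    0.114   ],
                    [-0.14713, -0.28886,  0.436   ],
                    [ 0.615,   -0.51499, -0.10001 ]])

def rgb_to_yuv(rgb):
    """Convert one linear RGB triple to YUV by matrix multiplication."""
    return RGB2YUV @ np.asarray(rgb, dtype=float)
```

For a neutral white (1, 1, 1), Y is 1 and the chroma components U and V are approximately 0, as expected for an achromatic pixel.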
(250) The following describes, with reference to
(251) Step S1301: An electronic device displays a first screen in response to a third operation.
(252) Specifically, the third operation may be tapping a first application image in a gallery application, the first application image is a photo taken by a camera, and the first screen may be a viewing screen for the first application image. For example, the viewing screen may be the user interface 52 in the embodiment of
(253) Step S1302: When the electronic device detects a second operation on a first function control on the first screen, the electronic device converts the first application image from an sRGB color space to an XYZ color space in response to the operation to obtain a first converted image.
(254) Specifically, the first function control may be the chromatic adaptation processing function box in the embodiment of
(255) Step S1303: The electronic device converts the first converted image from the XYZ color space to an LMS color space to obtain a third image.
(256) For step S1303, refer to step S705. Details are not described again in this embodiment of this application.
(257) Step S1304: The electronic device reads a chromatic adaptation conversion matrix of the first application image.
(258) Specifically, in a process in which the electronic device performs photographing by using the camera to obtain the first application image, the electronic device calculates the chromatic adaptation conversion matrix CA of the first application image, and stores the first application image and its corresponding chromatic adaptation conversion matrix in an image file or an internal memory. When the electronic device needs to perform chromatic adaptation processing on the first application image by using the chromatic adaptation conversion matrix, the electronic device may read the chromatic adaptation conversion matrix. For a method and a process of calculating, by the electronic device, the chromatic adaptation conversion matrix of the first application image in the photographing process, refer to related descriptions in the embodiment of
(259) It should be understood that step S1304 only needs to be performed before step S1305, and an execution sequence of step S1304 is not limited in this embodiment of this application.
(260) Step S1305: The electronic device performs chromatic adaptation processing on the third image in the LMS color space to obtain a fourth image.
(261) Step S1306: The electronic device converts the fourth image from the LMS color space to the XYZ color space to obtain a second converted image.
(262) Step S1307: The electronic device converts the second converted image from the XYZ color space to the sRGB color space to obtain a third converted image.
(263) Step S1308: The electronic device performs RGB color space image processing on the third converted image to obtain a fourth converted image.
(264) Step S1309: The electronic device converts the fourth converted image from the sRGB color space to a YUV color space to obtain a fifth converted image.
(265) Step S1310: The electronic device performs YUV color space processing on the fifth converted image to obtain a target image.
(266) In a possible implementation, after performing YUV color space processing on the fifth converted image, the electronic device may convert a processed image into a JPEG format to obtain the target image.
(267) For related descriptions of step S1305 to step S1310, refer to related descriptions of step S706 to step S711 in the embodiment of
(268) All or some of the foregoing embodiments may be implemented by software, hardware, firmware, or any combination thereof. When software is used to implement the embodiments, some or all of the embodiments may be implemented in a form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, all or some of the processes or functions according to this application are generated. The computer may be a general-purpose computer, a dedicated computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or may be transmitted from a computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, a coaxial cable, an optical fiber, or a digital subscriber line) or a wireless manner (for example, infrared, radio, or microwave). The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device, such as a server or a data center, integrating one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid-state drive (SSD)), or the like.
(269) A person of ordinary skill in the art can understand that all or some of the processes of the methods in the foregoing embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium. When the program is executed, the processes in the foregoing method embodiments may be included. The storage medium includes any medium that can store computer program code, for example, a ROM, a random access memory RAM, a magnetic disk, or an optical disc.
(270) To sum up, the foregoing descriptions are merely embodiments of the technical solutions of the present invention, but are not intended to limit the protection scope of the present invention. Any modification, equivalent replacement, or improvement made based on the disclosure of the present invention shall fall within the protection scope of the present invention.