APPARATUS FOR GENERATING HIGH-DYNAMIC-RANGE IMAGE, METHOD, AND STORAGE MEDIUM
20190068866 · 2019-02-28
CPC classification
H04N23/88 (ELECTRICITY)
H04N23/741 (ELECTRICITY)
H04N23/667 (ELECTRICITY)
Abstract
An object of the present invention is to generate a high-dynamic-range image with fewer pseudo-contours. An apparatus of the present invention generates a high-dynamic-range image and includes a setting unit for setting an imaging condition for each region that constitutes an image. The apparatus also includes a setting unit configured to set an exposure condition for each region and a correcting unit configured to correct the set exposure conditions so that a difference between a maximum value and a minimum value among the set exposure conditions becomes small.
Claims
1. An apparatus for generating a high-dynamic-range image and including a setting unit for setting an imaging condition for each region that constitutes an image, the apparatus comprising: a setting unit configured to set an exposure condition for the region; and a correcting unit configured to correct the set exposure condition so that a difference between a maximum value and a minimum value out of the set exposure condition is small.
2. The apparatus according to claim 1, further comprising a preliminary exposure unit configured to perform preliminary exposure for determining the exposure condition.
3. The apparatus according to claim 1, wherein the setting unit sets a value of the exposure condition for a relatively bright region to be lower than a value of the exposure condition for a relatively dark region.
4. The apparatus according to claim 1, wherein the correcting unit corrects the set exposure condition based on a difference between a maximum value and a minimum value out of the set exposure condition.
5. The apparatus according to claim 4, wherein the correcting unit corrects the set exposure condition so that a difference between a maximum value and a minimum value out of the set exposure condition is equal to or less than an allowable maximum exposure difference.
6. The apparatus according to claim 5, wherein the correcting unit corrects the set exposure condition in accordance with the following expression:
7. The apparatus according to claim 6, wherein the maximum exposure difference denoted as Eth is one value within a range between ? and 1.
8. The apparatus according to claim 1, wherein the imaging condition includes at least one of a lens aperture value, ISO sensitivity, and a shutter speed.
9. The apparatus according to claim 1, wherein the apparatus operates in an image capturing mode of either a first mode representing a broad dynamic range or a second mode placing a priority on a tone representation.
10. The apparatus according to claim 9, wherein in a case where the apparatus operates in the second mode, the correcting unit corrects the set exposure condition, and in a case where the apparatus operates in the first mode, the correcting unit does not correct the set exposure condition.
11. The apparatus according to claim 1, further comprising an imaging unit configured to capture an image in accordance with an imaging condition and an exposure condition for the region.
12. The apparatus according to claim 11, wherein in a case where the exposure condition is corrected by the correcting unit, the imaging unit captures an image in accordance with an imaging condition changed based on the corrected exposure condition.
13. The apparatus according to claim 12, wherein a change of the imaging condition based on the corrected exposure condition is made by changing at least either one of ISO sensitivity and a shutter speed.
14. The apparatus according to claim 13, wherein a change of the imaging condition based on the corrected exposure condition is made by using a table in which values for an exposure condition, ISO sensitivity, and a shutter speed are stored.
15. The apparatus according to claim 12, wherein a change of the imaging condition based on the corrected exposure condition is made in conformance with an image capturing mode, and the image capturing mode includes a mode of placing a priority on a shutter speed and a mode of fixing ISO sensitivity.
16. The apparatus according to claim 11, further comprising a developing unit configured to perform development processing for image data acquired by capturing an image by the imaging unit, wherein the development processing includes white balance processing, demosaic processing, and gamma processing.
17. The apparatus according to claim 1, further comprising a setting unit configured to set a reference region where an exposure condition is retained, wherein the correcting unit does not correct an exposure condition of the reference region, and corrects an exposure condition of a region other than the reference region.
18. The apparatus according to claim 1, wherein the exposure condition includes at least any one of an aperture, a shutter speed, and ISO sensitivity.
19. A method for generating a high-dynamic-range image which is performed by an apparatus including a setting unit for setting an imaging condition for each region that constitutes an image, the method comprising the steps of: setting an exposure condition for the region; and correcting the set exposure condition so that a difference between a maximum value and a minimum value out of the set exposure condition is small.
20. A non-transitory computer readable storage medium storing a program for causing a computer to execute each step in a method for generating a high-dynamic-range image which is performed by an apparatus including a setting unit for setting an imaging condition for each region that constitutes an image, the method comprising the steps of: setting an exposure condition for the region; and correcting the set exposure condition so that a difference between a maximum value and a minimum value out of the set exposure condition is small.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DESCRIPTION OF THE EMBODIMENTS
First Embodiment
(Configuration of Image Processing Apparatus)
[0016] A configuration of the image processing apparatus according to the present embodiment will be described below by taking an image capturing apparatus as an example.
[0018] An image-capture control unit 205 follows instructions from the CPU 202 and controls the optical unit 102; to be more specific, it performs control such as focusing, opening the shutter, and adjusting the aperture. A control unit 206 controls the start, end, and the like of the imaging operation in response to user instructions, that is, the user's pressing of the photographing button 103 and the operation button 105. A character generation unit 207 generates data such as characters and graphics for display on the display unit 104.
[0019] An A/D conversion unit 208 converts the light amount of an object detected by the imaging element 201 into a digital signal value. An image processing unit 209 performs image processing on image data in which each pixel has a digital signal value obtained as a result of conversion by the A/D conversion unit 208. An encoder unit 210 performs conversion processing to convert image data processed by the image processing unit 209 into a file format such as JPEG. A media I/F 211 is an interface for transmitting/receiving image data to/from a PC/media 213, that is, a PC and/or media. It should be noted that the media described herein include a hard disk, a memory card, a CF card, an SD card, and the like. A system bus 212 is a bus for transmitting/receiving data.
(Imaging Processing)
[0020] With reference to
[0021] In Step S401, the CPU 202 acquires an imaging condition that has been inputted through a user's operation of the operation button 105. This imaging condition includes at least one of a lens aperture value, a shutter speed, and ISO sensitivity.
[0022] In Step S402, the CPU 202 acquires information (referred to as image capturing mode information) indicating an image capturing mode set by the user.
[0023] In Step S403, the CPU 202 determines whether the photographing button 103 is pressed by the user. In the case where the determination result in Step S403 is true, the process advances to Step S404. Meanwhile, in the case where the determination result in Step S403 is false, the process returns to Step S401.
[0024] In Step S404, the CPU 202 causes the image-capture control unit 205 to drive the optical unit 102 to perform preliminary exposure, thereby determining the exposure condition for each region of the imaging element 201.
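The per-region exposure determination of Step S404 can be sketched as follows. This is a hypothetical illustration, not the patent's actual procedure: the region names, the mid-grey target level, and the log-based EV formula are all assumptions. It reflects claim 3, in which a relatively bright region receives a lower exposure value than a relatively dark region.

```python
# Hypothetical sketch of Step S404: deriving a per-region exposure
# condition (in stops, EV) from the mean luminance measured during
# preliminary exposure. Target level and formula are assumptions.
import math

TARGET_LEVEL = 0.18  # assumed mid-grey target for each region

def exposure_conditions(region_means):
    """Map each region's mean luminance (0..1) from the preliminary
    exposure to a relative exposure value in stops.

    Brighter regions receive a lower exposure value, darker regions a
    higher one, pulling every region toward the target level.
    """
    conditions = {}
    for region, mean in region_means.items():
        mean = max(mean, 1e-6)  # avoid log2(0) for fully black regions
        conditions[region] = math.log2(TARGET_LEVEL / mean)
    return conditions

# A bright sky region gets a negative EV, a shadow region a positive one.
evs = exposure_conditions({"sky": 0.72, "shadow": 0.045})
```

Note that the 4-stop spread between the two example regions is exactly the kind of large exposure difference the later correction processing compresses.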
[0027] In Step S405, the CPU 202 determines whether the HDR image capturing mode specified by the image capturing mode information acquired in Step S402 is the D-range-oriented mode. In the case where the determination result in Step S405 is true, the process advances to Step S407. Meanwhile, in the case where the determination result in Step S405 is false, the process advances to Step S406.
[0028] In Step S406, the CPU 202 executes exposure correction processing to correct an exposure condition for each region that has been set in Step S404. Incidentally, the details of the exposure correction processing will be described later with reference to
[0029] In Step S407, the image-capture control unit 205, having received an instruction from the CPU 202, drives the optical unit 102 in accordance with the imaging condition acquired in Step S401 and the exposure condition set in at least either one of Step S404 and Step S406 so as to shoot the object. As a result, the light amount of the object is detected by the imaging element 201, that is, the HDR sensor. Further, image data in which each pixel expresses the detected light amount as a pixel value is converted into RAW image data by the A/D conversion unit 208. Incidentally, the RAW image data described herein is image data that has only one color out of R, G, and B for each pixel.
[0030] In Step S408, the image processing unit 209 applies development processing to the RAW image data and generates an RGB image (bitmap-type image data in which each pixel has three channels, that is, the pixel values of R, G, and B). In general, the development processing that generates an RGB image from RAW image data includes white balance processing, demosaic processing, and gamma processing.
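Two of the development steps named above, white balance and gamma processing, can be sketched minimally as follows. Demosaicing is omitted (the input is assumed to be already-interpolated linear RGB), and the per-channel gains and gamma value are illustrative assumptions, not values from the patent.

```python
# Minimal sketch of white balance and gamma processing on linear RGB
# pixels in the range 0..1. Gains and gamma are assumed values.

WB_GAINS = (2.0, 1.0, 1.5)  # assumed white-balance gains for (R, G, B)
GAMMA = 1.0 / 2.2           # assumed display gamma encoding exponent

def develop(pixels):
    """Apply white balance, then gamma encoding, to a list of RGB tuples."""
    out = []
    for r, g, b in pixels:
        # White balance: scale each channel, clip to the valid range.
        balanced = [min(c * gain, 1.0) for c, gain in zip((r, g, b), WB_GAINS)]
        # Gamma processing: compress linear values for display.
        out.append(tuple(c ** GAMMA for c in balanced))
    return out

developed = develop([(0.25, 0.5, 0.3)])
```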
[0031] In Step S409, the CPU 202 outputs the image data to which the development processing has been applied in Step S408. For instance, the CPU 202 causes the display unit 104 to display the image data, or causes the encoder unit 210 to convert the image data into a file format such as JPEG so as to output the resultant to the PC/media 213, that is, an external PC and/or media, via the media I/F 211.
[0032] As such, according to the present embodiment, processing is switched in accordance with the HDR image capturing mode. In the case where the HDR image capturing mode is the tone-oriented mode, exposure correction processing is performed. Meanwhile, in the case where the HDR image capturing mode is the D-range-oriented mode, the imaging processing is performed without performing the exposure correction processing. Accordingly, in the case where the user takes a photograph in the tone-oriented mode, it is possible to suppress the occurrence of pseudo-contours such as tone inversion which is likely to occur in the case where the dynamic range of an object is broad. The above is the content of imaging processing in the present embodiment.
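The mode switch described above can be sketched as a small dispatch function. The mode names and the correction callback below are hypothetical placeholders; the patent only specifies that the tone-oriented mode runs exposure correction (Step S406) while the D-range-oriented mode skips it.

```python
# Sketch of the mode switch in Step S405/S406: the tone-oriented mode
# applies exposure correction, the D-range-oriented mode keeps the
# exposure conditions from Step S404 as-is. Names are assumptions.

TONE_ORIENTED = "tone"
D_RANGE_ORIENTED = "d-range"

def plan_capture(mode, exposures, correct):
    """Return the per-region exposure conditions to use for capture."""
    if mode == TONE_ORIENTED:
        return correct(exposures)  # Step S406: suppress pseudo-contours
    return exposures               # D-range mode: keep the full spread

planned = plan_capture(D_RANGE_ORIENTED, {"a": -3.0, "b": 2.0}, lambda e: e)
```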
(Exposure Correction Processing)
[0033] With reference to
[0034] In Step S701, the CPU 202 scans each region in the processor element array layer 302 to derive a maximum value (denoted as Emax) and a minimum value (denoted as Emin) among the exposure conditions of the regions. In the case of
[0035] In Step S702, the CPU 202 determines whether the difference between Emax and Emin derived in Step S701 is larger than Eth obtained in Step S402. In the case where the determination result in Step S702 is true, the process advances to Step S703. Meanwhile, in the case where the determination result in Step S702 is false, the exposure correction processing ends.
[0036] In Step S703, the CPU 202 focuses on one unprocessed region in the processor element array layer 302. In the following, the region being focused on is referred to as a region of interest.
[0037] In Step S704, the CPU 202 corrects an exposure condition of a region of interest so that the difference between a maximum value and a minimum value among exposure conditions within an image becomes equal to or lower than a maximum exposure difference (equal to or lower than Eth). A correction of the exposure condition in this step is made, for example, by following Equation (1) below.
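Equation (1) itself is not reproduced in this excerpt. One formulation consistent with the stated goal, linearly compressing the spread of exposure conditions so that it equals the allowable maximum exposure difference, would be the following (an illustrative reconstruction, not necessarily the patent's actual expression):

```latex
E' = E_{\min} + \left(E - E_{\min}\right)\,\frac{E_{th}}{E_{\max} - E_{\min}},
\qquad \text{if } E_{\max} - E_{\min} > E_{th}
```

Under such a rescaling the corrected values keep their ordering, the darkest region's exposure condition stays at \(E_{\min}\), and the new spread is exactly \(E_{th}\).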
[0038] In Equation (1), E′ represents the exposure condition after correction, and E represents the exposure condition before correction.
[0039] Further, in a case of fixing the exposure of a certain region, a region that retains its exposure condition (referred to as a reference region) may be set, and based on the corrected values obtained by calculation using Equation (1) and the exposure condition of the reference region, the exposure conditions of the regions whose exposures are unfixed may be shifted. In
[0040] In Step S705, the CPU 202 determines an imaging condition for the region of interest in accordance with the exposure condition corrected in the preceding Step S704. An exposure is typically controlled by adjusting an aperture, a shutter speed, and an analog gain. In the present embodiment, however, the exposure is controlled by fixing the aperture and adjusting at least either one of a shutter speed and an analog gain, which are locally adjustable. It should be noted that the analog gain of the HDR sensor is proportional to the ISO sensitivity, and accordingly, the case of controlling the ISO sensitivity will be described below.
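The table-based conversion of claim 14 (exposure condition to ISO sensitivity and shutter speed) can be sketched as below. The table entries and the nearest-neighbor lookup are illustrative assumptions; the patent only states that a table storing values for the exposure condition, ISO sensitivity, and shutter speed is used.

```python
# Hypothetical sketch of Step S705 / claim 14: converting a corrected
# per-region exposure offset (in stops) into ISO sensitivity and shutter
# speed via a lookup table. All table values are assumptions.

# (exposure offset in EV) -> (ISO sensitivity, shutter speed in seconds)
EXPOSURE_TABLE = {
    -2.0: (100, 1 / 500),
    -1.0: (100, 1 / 250),
     0.0: (100, 1 / 125),
     1.0: (200, 1 / 125),  # shutter fixed, analog gain (ISO) raised instead
     2.0: (400, 1 / 125),
}

def imaging_condition(ev):
    """Pick the table entry whose EV is closest to the requested offset."""
    nearest = min(EXPOSURE_TABLE, key=lambda k: abs(k - ev))
    return EXPOSURE_TABLE[nearest]

iso, shutter = imaging_condition(1.2)
```

A table like this also makes the mode-dependent change of paragraph [0041]/claim 15 easy to express: a shutter-speed-priority mode would populate the rows with a fixed shutter value, while an ISO-fixed mode would vary only the shutter column.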
[0041] In Step S706, the CPU 202 determines whether the series of processing from Step S703 through Step S705 has been performed for all the regions. In the case where the determination result in this step is true, the series of processing ends. Meanwhile, in the case where the determination result in this step is false, the process returns to Step S703. The above is the content of the exposure correction processing in the present embodiment.
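The correction loop of Steps S701 through S706 can be sketched end-to-end as follows. The compression formula used here is an assumption consistent with the stated goal (the spread of corrected exposure conditions becomes at most Eth); it is not necessarily the patent's actual Equation (1). The optional reference region corresponds to paragraph [0039] and claim 17.

```python
# Sketch of the exposure correction processing (Steps S701-S706).
# The linear compression toward Emin is an assumed formula.

def correct_exposures(exposures, eth, reference=None):
    """Compress per-region exposure values so that max - min <= eth.

    exposures: dict mapping region id -> exposure value (e.g. in stops).
    reference: optional region whose exposure condition is retained; the
    other regions are shifted so the reference keeps its original value.
    """
    e_max, e_min = max(exposures.values()), min(exposures.values())  # S701
    if e_max - e_min <= eth:                                         # S702
        return dict(exposures)
    scale = eth / (e_max - e_min)
    corrected = {r: e_min + (e - e_min) * scale                      # S703-S704
                 for r, e in exposures.items()}
    if reference is not None:
        # Shift all regions so the reference region keeps its value.
        shift = exposures[reference] - corrected[reference]
        corrected = {r: e + shift for r, e in corrected.items()}
    return corrected

out = correct_exposures({"sky": -3.0, "mid": 0.0, "shadow": 3.0}, eth=4.0)
```

After correction the ordering of the regions is preserved, so relatively bright regions still receive lower exposure values than relatively dark ones, only with a smaller spread between them.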
Other Embodiments
[0042] Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a non-transitory computer-readable storage medium) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
[0043] According to the present invention, it is possible to generate a high-dynamic-range image having fewer pseudo-contours.
[0044] While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
[0045] This application claims the benefit of Japanese Patent Application No. 2017-164824 filed Aug. 29, 2017, which is hereby incorporated by reference herein in its entirety.