IMAGE PROCESSING DEVICE, ELECTRONIC EQUIPMENT, IMAGE PROCESSING METHOD, AND PROGRAM
20220360702 · 2022-11-10
Inventors
CPC classification
H04N23/81 (Electricity)
G01S17/894 (Physics)
G01S7/4918 (Physics)
International classification
Abstract
An image processing device includes an image generation unit (212) that generates, in an IR image frame, a first IR image captured in a state in which a pulse wave is on and a second IR image captured in a state in which the pulse wave is off, and an image correction unit (213) that corrects the first IR image on the basis of the second IR image.
Claims
1. An image processing device comprising: an image generation unit that generates, in an IR image frame, a first IR image captured in a state in which a pulse wave is on and a second IR image captured in a state in which the pulse wave is off; and an image correction unit that corrects the first IR image on a basis of the second IR image.
2. The image processing device according to claim 1, wherein the IR image frame includes a phase of generating the first IR image and a phase of generating the second IR image.
3. The image processing device according to claim 1, wherein the image correction unit removes background light and a dark component included in the first IR image on a basis of the second IR image.
4. The image processing device according to claim 1, wherein the image correction unit individually adjusts exposure time of a TOF sensor in each frame.
5. The image processing device according to claim 4, wherein the image correction unit individually adjusts the exposure time of the TOF sensor in each of the IR image frame and a depth image frame.
6. The image processing device according to claim 5, wherein the image correction unit controls the exposure time in a phase included in the IR image frame to be longer than the exposure time in a phase included in the depth image frame.
7. The image processing device according to claim 4, wherein the image correction unit individually adjusts the exposure time of the TOF sensor in each of the IR image frame, a depth image frame, and an eye-gaze detection frame.
8. The image processing device according to claim 7, wherein the image correction unit performs control in such a manner that the exposure time in a phase included in the IR image frame, the exposure time in a phase included in the eye-gaze detection frame, and the exposure time in a phase included in the depth image frame are lengthened in this order.
9. The image processing device according to claim 1, further comprising a correction selecting unit that selects a correction method according to a positional relationship between a subject included in the first IR image and a light source.
10. The image processing device according to claim 9, wherein the correction selecting unit selects the correction method according to a distance between the subject and the light source.
11. The image processing device according to claim 10, wherein the correction selecting unit selects, for the first IR image, either a correction based on the second IR image or a correction based on a dark image stored in advance in a storage unit according to the distance between the subject and the light source.
12. The image processing device according to claim 11, wherein the correction selecting unit selects, for the first IR image, the correction based on the second IR image in a case where the distance between the subject and the light source is equal to or shorter than a threshold, and selects the correction based on the dark image in a case where the distance between the subject and the light source exceeds the threshold.
13. The image processing device according to claim 12, wherein the subject is a face of a person, and the light source is the sun.
14. Electronic equipment comprising: a TOF sensor; an image generation unit that generates, on a basis of an output from the TOF sensor, a first IR image captured in a state in which a pulse wave is on and a second IR image captured in a state in which the pulse wave is off in an IR image frame; and an image correction unit that corrects the first IR image on a basis of the second IR image.
15. An image processing method comprising: generating, in an IR image frame, a first IR image captured in a state in which a pulse wave is on and a second IR image captured in a state in which the pulse wave is off; and correcting the first IR image on a basis of the second IR image.
16. A program causing a computer to function as an image generation unit that generates, in an IR image frame, a first IR image captured in a state in which a pulse wave is on and a second IR image captured in a state in which the pulse wave is off, and an image correction unit that corrects the first IR image on a basis of the second IR image.
Description
BRIEF DESCRIPTION OF DRAWINGS
DESCRIPTION OF EMBODIMENTS
[0036] In the following, embodiments of the present disclosure will be described in detail on the basis of the drawings. Note that in the following embodiments, redundant description is omitted by assigning the same reference signs to identical parts.
[0037] The present disclosure will be described in the following order of items.
[0038] 1. Configuration of electronic equipment
[0039] 1-1. Frame configuration
[0040] 1-2. Indirect TOF method
[0041] 1-3. System configuration of indirect TOF distance image sensor
[0042] 1-4. Circuit configuration of pixel in indirect TOF distance image sensor
[0043] 2. First embodiment
[0044] 2-1. Configuration of image processing device
[0045] 2-2. Image processing method
[0046] 2-3. Frame configuration
[0047] 3. Second embodiment
[0048] 3-1. Image processing device
[0049] 3-2. Correction selecting method
[0050] 3-3. Processing of correction selecting method
[0051] 4. Modification example of second embodiment
1. Configuration of Electronic Equipment
[0052] The present disclosure can be suitably applied to a technology of correcting an IR image acquired by photographing an object with a TOF sensor. Thus, first, an indirect TOF method will be described in order to make it easy to understand the present disclosure. The indirect TOF method is a technology of emitting source light (such as laser light in an infrared region) modulated by, for example, pulse width modulation (PWM) to an object, receiving reflected light thereof with a light receiving element, and performing ranging with respect to an object to be measured on the basis of a phase difference in the received reflected light.
[0053] An example of a configuration of electronic equipment according to an embodiment of the present disclosure will be described with reference to
[0054] As illustrated in
[0055] The imaging device 10 includes a light source unit 11, a light receiving unit 12, and an imaging processing unit 13.
[0056] The light source unit 11 includes, for example, a light emitting element that emits light having a wavelength of an infrared region, and a drive circuit that drives the light emitting element to emit light. The light emitting element can be realized by, for example, a light emitting diode (LED). Note that the light emitting element is not limited to the LED, and may be realized by, for example, a vertical cavity surface emitting laser (VCSEL) in which a plurality of light emitting elements is formed in an array.
[0057] The light receiving unit 12 includes, for example, a light receiving element capable of detecting light having the wavelength of the infrared region, and a signal processing circuit that outputs a pixel signal corresponding to the light detected by the light receiving element. The light receiving element can be realized by, for example, a photodiode. Note that the light receiving element is not limited to a photodiode, and may be realized by other elements.
[0058] The imaging processing unit 13 executes various kinds of imaging processing, for example, in response to an imaging instruction from the image processing device 20. For example, the imaging processing unit 13 generates a light source control signal to drive the light source unit 11 and performs an output thereof to the light source unit 11.
[0059] The imaging processing unit 13 controls light reception by the light receiving unit 12 in synchronization with the light source control signal supplied to the light source unit 11. For example, the imaging processing unit 13 generates an exposure control signal to control exposure time of the light receiving unit 12 in synchronization with the light source control signal, and performs an output thereof to the light receiving unit 12. The light receiving unit 12 performs exposure for an exposure period indicated by the exposure control signal, and outputs a pixel signal to the imaging processing unit 13.
[0060] The imaging processing unit 13 calculates distance information on the basis of the pixel signal output from the light receiving unit 12. The imaging processing unit 13 may generate predetermined image information on the basis of this pixel signal. The imaging processing unit 13 outputs the generated distance information and image information to the image processing device 20.
[0061] For example, the imaging processing unit 13 generates a light source control signal to drive the light source unit 11 according to an instruction to execute imaging from the image processing device 20, and supplies the light source control signal to the light source unit 11. Here, the imaging processing unit 13 generates a light source control signal, which is modulated by the PWM into a rectangular wave having a predetermined duty, and supplies the light source control signal to the light source unit 11. At the same time, the imaging processing unit 13 controls light reception by the light receiving unit 12 on the basis of an exposure control signal synchronized with the light source control signal.
[0062] In the imaging device 10, the light source unit 11 blinks and emits light according to the predetermined duty in response to the light source control signal generated by the imaging processing unit 13. The light emitted from the light source unit 11 is emitted as emission light 30 from the light source unit 11. The emission light 30 is reflected by an object 31 and received by the light receiving unit 12 as reflected light 32, for example. The light receiving unit 12 generates a pixel signal corresponding to the reception of the reflected light 32 and performs an output thereof to the imaging processing unit 13. Note that in practice, the light receiving unit 12 also receives ambient background light from the surroundings in addition to the reflected light 32, and the pixel signal includes this background light and a dark component due to the light receiving unit 12 together with the component of the reflected light 32.
[0063] Also, in the present embodiment, the imaging device 10 images the object 31 in a state in which the light source unit 11 is off and does not emit light. Then, the light receiving unit 12 receives background light around the object 31. In this case, a pixel signal generated by the light receiving unit 12 includes only the background light and the dark component caused by the light receiving unit 12.
[0064] The imaging processing unit 13 causes the light receiving unit 12 to perform light reception a plurality of times at different phases. The imaging processing unit 13 calculates a distance D to the object 31 on the basis of a difference between the pixel signals acquired at the different phases. From this difference, the imaging processing unit 13 also calculates image information obtained by extracting the component of the reflected light 32, as well as image information that includes both the component of the reflected light 32 and a component of the ambient light. In the following, the former is referred to as direct reflected light information, and the latter is referred to as RAW image information.
[0065] (1-1. Frame Configuration)
[0066] A configuration of a frame used for imaging by the imaging device 10 will be described with reference to
[0067] As illustrated in
[0068] One microframe includes a plurality of phases such as a first phase, a second phase, a third phase, a fourth phase, a fifth phase, a sixth phase, a seventh phase, and an eighth phase. One microframe can include eight phases at a maximum. Thus, processing of the plurality of phases can be executed within one microframe period. Note that a dead time period is provided at the end of each microframe in order to prevent interference with processing of a next microframe.
[0069] In the present embodiment, an object can be imaged in one phase. As illustrated in
[0070] (1-2. Indirect TOF Method)
[0071] A principle of the indirect TOF method will be described with reference to
[0072] In
[0073] The imaging processing unit 13 samples the pixel signal of the received reflected light 32 a plurality of times at different phases, and acquires a light quantity value indicating the received light quantity at each sampling. In the example of
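Although the exact expression is not reproduced in this description, the widely used four-phase formulation of the indirect TOF method (an illustrative sketch; Q.sub.0 to Q.sub.270 denote the light quantity values sampled at phase offsets of 0°, 90°, 180°, and 270°, c is the speed of light, and f.sub.mod is the modulation frequency) recovers the phase difference and the distance as:

```latex
\Delta\varphi = \operatorname{atan2}\left(Q_{270}-Q_{90},\; Q_{0}-Q_{180}\right), \qquad
D = \frac{c}{4\pi f_{\mathrm{mod}}}\,\Delta\varphi
```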
[0074] (1-3. System Configuration of Indirect TOF Distance Image Sensor)
[0075] An example of a system configuration of an indirect TOF image sensor according to the present disclosure will be described with reference to
[0076] As illustrated in
[0077] A pixel array portion 10020 is formed on the sensor chip 10001. The pixel array portion 10020 includes a plurality of pixels 10230 arranged in a matrix (array) in a two-dimensional grid pattern on the sensor chip 10001. In the pixel array portion 10020, each of the plurality of pixels 10230 receives infrared light, performs photoelectric conversion, and outputs an analog pixel signal. In the pixel array portion 10020, two vertical signal lines VSL.sub.1 and VSL.sub.2 are wired for each pixel column. When it is assumed that the number of pixel columns in the pixel array portion 10020 is M (M is an integer), 2×M vertical signal lines VSL are wired in total on the pixel array portion 10020.
[0078] Each of the plurality of pixels 10230 has two taps A and B (details thereof will be described later). In the two vertical signal lines VSL.sub.1 and VSL.sub.2, a pixel signal AIN.sub.P1 based on a charge of a tap A of a pixel 10230 in a corresponding pixel column is output to the vertical signal line VSL.sub.1, and a pixel signal AIN.sub.P2 based on a charge of a tap B of the pixel 10230 in the corresponding pixel column is output to the vertical signal line VSL.sub.2. The pixel signals AIN.sub.P1 and AIN.sub.P2 will be described later.
[0079] A vertical drive circuit 10010, a column signal processing unit 10040, an output circuit unit 10060, and a timing control unit 10050 are arranged on the circuit chip 10002. The vertical drive circuit 10010 drives each pixel 10230 of the pixel array portion 10020 in a unit of a pixel row and causes the pixel signals AIN.sub.P1 and AIN.sub.P2 to be output. Under the driving by the vertical drive circuit 10010, the pixel signals AIN.sub.P1 and AIN.sub.P2 output from the pixels 10230 in the selected row are supplied to the column signal processing unit 10040 through the vertical signal lines VSL.sub.1 and VSL.sub.2.
[0080] The column signal processing unit 10040 has a configuration including, in a manner corresponding to the pixel columns of the pixel array portion 10020, a plurality of ADCs (corresponding to column AD circuit described above) respectively provided for the pixel columns, for example. Each ADC performs AD conversion processing on the pixel signals AIN.sub.P1 and AIN.sub.P2 supplied through the vertical signal lines VSL.sub.1 and VSL.sub.2, and performs an output thereof to the output circuit unit 10060. The output circuit unit 10060 executes CDS processing or the like on the digitized pixel signals AIN.sub.P1 and AIN.sub.P2 output from the column signal processing unit 10040, and performs an output thereof to the outside of the circuit chip 10002.
[0081] The timing control unit 10050 generates various timing signals, clock signals, control signals, and the like. Drive control of the vertical drive circuit 10010, the column signal processing unit 10040, the output circuit unit 10060, and the like is performed on the basis of these signals.
[0082] (1-4. Circuit Configuration of Pixel in Indirect TOF Distance Image Sensor)
[0084] A pixel 10230 according to the present example includes, for example, a photodiode 10231 as a photoelectric conversion unit. In addition to the photodiode 10231, the pixel 10230 includes an overflow transistor 10242, two transfer transistors 10232 and 10237, two reset transistors 10233 and 10238, two floating diffusion layers 10234 and 10239, two amplifier transistors 10235 and 10240, and two selection transistors 10236 and 10241. The two floating diffusion layers 10234 and 10239 correspond to the taps A and B illustrated in
[0085] The photodiode 10231 photoelectrically converts received light and generates a charge. The photodiode 10231 can have a back-illuminated pixel structure. The back-illuminated structure is as described in the pixel structure of the CMOS image sensor. However, the back-illuminated structure is not a limitation, and a front-illuminated structure in which light emitted from a side of a front surface of a substrate is captured may be employed.
[0086] The overflow transistor 10242 is connected between a cathode electrode of the photodiode 10231 and a power-supply line of a power supply voltage VDD, and has a function of resetting the photodiode 10231. Specifically, the overflow transistor 10242 sequentially discharges the charge of the photodiode 10231 to the power-supply line by turning into a conduction state in response to an overflow gate signal OFG supplied from the vertical drive circuit 10010.
[0087] The two transfer transistors 10232 and 10237 are connected between the cathode electrode of the photodiode 10231 and the two floating diffusion layers 10234 and 10239, respectively. Then, the transfer transistors 10232 and 10237 sequentially transfer the charges generated in the photodiode 10231 to the floating diffusion layers 10234 and 10239 respectively by turning into the conduction state in response to a transfer signal TRG supplied from the vertical drive circuit 10010.
[0088] The floating diffusion layers 10234 and 10239 corresponding to the taps A and B accumulate the charges transferred from the photodiode 10231, convert the charges into voltage signals having voltage values corresponding to the charge amounts, and generate the pixel signals AIN.sub.P1 and AIN.sub.P2.
[0089] The two reset transistors 10233 and 10238 are connected between the power-supply line of the power supply voltage VDD and the two floating diffusion layers 10234 and 10239, respectively. Then, by turning into the conduction state in response to a reset signal RST supplied from the vertical drive circuit 10010, the reset transistors 10233 and 10238 respectively extract the charges from the floating diffusion layers 10234 and 10239 and initialize the charge amounts.
[0090] The two amplifier transistors 10235 and 10240 are respectively connected between the power-supply line of the power supply voltage VDD and the two selection transistors 10236 and 10241, and respectively amplify the voltage signals on which charge-voltage conversion is respectively performed in the floating diffusion layers 10234 and 10239.
[0091] The two selection transistors 10236 and 10241 are connected between the two amplifier transistors 10235 and 10240 and the vertical signal lines VSL.sub.1 and VSL.sub.2, respectively. Then, by turning into the conduction state in response to a selection signal SEL supplied from the vertical drive circuit 10010, the selection transistors 10236 and 10241 respectively output the voltage signals respectively amplified in the amplifier transistors 10235 and 10240 to the two vertical signal lines VSL.sub.1 and VSL.sub.2 as the pixel signals AIN.sub.P1 and AIN.sub.P2.
[0092] The two vertical signal lines VSL.sub.1 and VSL.sub.2 are connected, for each pixel column, to an input end of one ADC in the column signal processing unit 10040, and transmit the pixel signals AIN.sub.P1 and AIN.sub.P2 output from the pixels 10230 in each pixel column to the ADC.
[0093] Note that the circuit configuration of the pixel 10230 is not limited to the circuit configuration illustrated in
2. First Embodiment
[0094] (2-1. Image Processing Device)
[0095] A configuration of an image processing device 20 according to the first embodiment of the present disclosure will be described with reference to
[0096] As illustrated in
[0097] The IR image processing device 210 executes processing of correcting an IR image, and the like. The depth image processing device 220 executes processing of calculating depth, and the like. The IR image processing device 210 and the depth image processing device 220 execute processing in parallel.
[0098] The storage unit 230 stores various kinds of information. The storage unit 230 stores, for example, a dark image to correct an IR image. The storage unit 230 is realized by a semiconductor memory element such as a random access memory (RAM) or a flash memory, or a storage device such as a hard disk or an optical disk, for example.
[0099] The IR image processing device 210 includes an acquisition unit 211, an IR image generation unit 212, an image correction unit 213, a normalization unit 214, a reference unit 215, a first exposure time calculation unit 216, and a second exposure time calculation unit 217.
[0100] The acquisition unit 211 acquires various kinds of information from an imaging device 10. The acquisition unit 211 acquires, for example, RAW image information related to an object imaged by the imaging device 10. For example, the acquisition unit 211 selectively acquires RAW image information of each phase included in a microframe. For example, in order to correct an IR image, the acquisition unit 211 acquires RAW image information related to the object imaged in a state in which a light source unit 11 is on, and RAW image information related to the object imaged in a state in which the light source unit 11 is off. The acquisition unit 211 outputs the acquired RAW image information to the IR image generation unit 212.
[0101] The IR image generation unit 212 generates an IR image on the basis of the RAW image information received from the acquisition unit 211. For example, the IR image generation unit 212 may generate an IR image whose resolution is converted to one suitable for face authentication. The IR image generation unit 212 outputs the generated IR image to the image correction unit 213.
[0102] The image correction unit 213 executes various kinds of correction processing on the IR image received from the IR image generation unit 212. The image correction unit 213 executes correction processing in such a manner that the IR image becomes suitable for face authentication of a person included therein. For example, on the basis of the dark image stored in the storage unit 230, the image correction unit 213 executes an FPN correction on the IR image received from the IR image generation unit 212. For example, on the basis of an IR image related to the object imaged in a state in which the light source unit 11 is off (hereinafter, also referred to as light-source-off image), the image correction unit 213 executes the FPN correction on an IR image related to the object imaged in a state in which the light source unit 11 is on.
[0103] (2-2. Image Processing Method)
[0104] A principle of a method of executing the FPN correction on the basis of pieces of RAW image information captured in a state in which the light source is on and in a state in which the light source is off will be described with reference to
[0106] An example illustrated in
[0107] In the example illustrated in
A=G.sub.A(S+Amb)+D.sub.A (1)
B=G.sub.B(P−S+Amb)+D.sub.B (2)
[0108] In the expression (1) and the expression (2), G.sub.A represents a gain value of the tap A, G.sub.B represents a gain value of the tap B, P represents reflected light, S represents a light quantity of the reflected light received by the tap A, Amb represents background light, D.sub.A represents a dark component of the tap A, and D.sub.B represents a dark component of the tap B.
[0109] That is, the output value from the tap A includes the background light and the dark component of the tap A in addition to the reflected light from the object. Similarly, the output value from the tap B includes the background light and the dark component of the tap B in addition to the reflected light from the object. The imaging device 10 outputs the sum of the pixel signal A and the pixel signal B as RAW image information to the image processing device 20. Thus, the RAW image information output from the imaging device 10 to the image processing device 20 includes an influence of the background light, the dark component of the tap A, and the dark component of the tap B. Thus, it is desirable to remove the influence of the background light, the dark component of the tap A, and the dark component of the tap B in order to accurately perform recognition processing such as face authentication.
[0111] As illustrated in
A.sub.Off=G.sub.A(Amb.sub.Off)+D.sub.AOff (3)
B.sub.Off=G.sub.B(Amb.sub.Off)+D.sub.BOff (4)
[0112] In the expression (3) and the expression (4), Amb.sub.Off is the background light when the light source unit 11 is in the off state, D.sub.AOff is the dark component of the tap A when the light source unit 11 is in the off state, and D.sub.BOff is the dark component of the tap B when the light source unit 11 is in the off state. Since the background light and the dark components do not change regardless of whether the light source unit 11 is in the on state or the off state, the following relationships hold.
Amb.sub.Off=Amb (5)
D.sub.AOff=D.sub.A (6)
D.sub.BOff=D.sub.B (7)
[0113] Substituting the expression (5) to the expression (7) into the expression (3) and subtracting the expression (3) from the expression (1) give the following relationship.
A−A.sub.Off=SG.sub.A (8)
[0114] Substituting the expression (5) to the expression (7) into the expression (4) and subtracting the expression (4) from the expression (2) give the following relationship.
B−B.sub.Off=(P−S)G.sub.B (9)
[0115] Then, the following relationship is acquired by adding the expression (8) and the expression (9).
(A−A.sub.Off)+(B−B.sub.Off)=SG.sub.A+(P−S)G.sub.B (10)
[0116] As described above, the image correction unit 213 can remove the influence of the background light and the dark component on the basis of the pieces of RAW image information captured in a state in which the light source is on and a state in which the light source is off.
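As an illustrative sketch of this correction (the function and variable names below are assumptions, not from the document), the per-pixel arithmetic of expressions (1) to (4) reduces to subtracting the light-source-off reading of each tap:

```python
def fpn_correct(a_on, b_on, a_off, b_off):
    """Per-pixel FPN correction using a light-source-off capture.

    Subtracting the off readings cancels the background-light term (Amb)
    and the dark components (D_A, D_B), leaving only the reflected-light
    components of the taps A and B (expressions (1) to (4)).
    """
    return (a_on - a_off) + (b_on - b_off)

# Synthetic per-pixel check built from expressions (1) to (4).
g_a = g_b = 1.0                  # tap gains G_A, G_B
p, s = 100.0, 60.0               # reflected light P and tap-A share S
amb, d_a, d_b = 30.0, 5.0, 7.0   # background light and dark components
a_on = g_a * (s + amb) + d_a       # expression (1)
b_on = g_b * (p - s + amb) + d_b   # expression (2)
a_off = g_a * amb + d_a            # expression (3)
b_off = g_b * amb + d_b            # expression (4)
corrected = fpn_correct(a_on, b_on, a_off, b_off)  # -> 100.0, i.e. P when both gains are 1
```

Applied to whole images, the same subtraction is performed element-wise on each pixel.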
[0117] An effect of the FPN correction according to the embodiment of the present disclosure will be described with reference to
[0122] The normalization unit 214 normalizes the IR image received from the image correction unit 213. The normalization unit 214 outputs the normalized IR image to the outside. As a result, an IR image suitable for the face recognition processing is provided to the user.
[0123] The reference unit 215 receives, for example, depth calculated by a depth calculation unit 222. For example, the reference unit 215 receives accuracy of the depth. The reference unit 215 generates a mask image on the basis of the depth and the accuracy of the depth. Here, the mask image is, for example, an image acquired by masking of a subject other than the object included in a depth image. The reference unit 215 outputs the generated mask image to the first exposure time calculation unit 216 and the second exposure time calculation unit 217.
[0124] On the basis of the corrected IR image received from the image correction unit 213 and the mask image received from the reference unit 215, the first exposure time calculation unit 216 calculates exposure time in imaging to generate an IR image. As a result, optimal exposure time for generating the IR image is calculated.
[0125] On the basis of the mask image received from the reference unit 215 and the accuracy of the depth received from the depth calculation unit 222, the second exposure time calculation unit 217 calculates the exposure time in imaging to calculate the depth.
[0126] The depth image processing device 220 includes an acquisition unit 221 and the depth calculation unit 222.
[0127] The acquisition unit 221 acquires various kinds of information from the imaging device 10. The acquisition unit 221 acquires, for example, RAW image information related to the object imaged by the imaging device 10. For example, the acquisition unit 221 selectively acquires RAW image information of each phase included in a microframe. For example, the acquisition unit 221 acquires RAW image information of four phases, captured at phase differences of 0°, 90°, 180°, and 270°, in order to generate a depth image. The acquisition unit 221 outputs the acquired RAW image information to the depth calculation unit 222.
[0128] For example, the depth calculation unit 222 calculates depth on the basis of the RAW image information of the four phases received from the acquisition unit 221. The depth calculation unit 222 calculates, for example, accuracy on the basis of the calculated depth. For example, the depth calculation unit 222 may generate the depth image on the basis of the calculated depth. The depth calculation unit 222 outputs the calculated depth to the outside. As a result, distance information to the object can be acquired. Also, the depth calculation unit 222 outputs the calculated depth and accuracy to the reference unit 215.
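The four-phase depth calculation can be sketched as follows (an illustrative, commonly used formulation; the document does not give the exact expression, and the names `q0` to `q270` and `f_mod` are assumptions):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def depth_from_phases(q0, q90, q180, q270, f_mod):
    """Estimate depth from four phase-shifted light quantity samples.

    The phase difference between emission and reflection is recovered from
    the two quadrature differences, then scaled to a distance.
    """
    phase = math.atan2(q270 - q90, q0 - q180) % (2.0 * math.pi)
    return C * phase / (4.0 * math.pi * f_mod)
```

With a modulation frequency of 20 MHz, for example, the unambiguous range of this formulation is about 7.5 m.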
[0129] (2-3. Frame Configuration)
[0130] A frame configuration used for imaging according to the embodiment of the present disclosure will be described with reference to
[0131] As illustrated in
[0132] The IR image microframe includes, for example, two phases that are a phase A0 and a phase A1. The phase A0 is, for example, a phase in which the object is imaged in a state in which the light source unit 11 is off. The phase A1 is, for example, a phase in which the object is imaged in a state in which the light source unit 11 is on.
[0133] The depth image microframe includes, for example, four phases that are a phase B0, a phase B1, a phase B2, and a phase B3. The phase B0 is, for example, a phase in which the object is imaged when a phase difference between emission light to the object and reflected light from the object is 0°. The phase B1 is, for example, a phase in which the object is imaged when the phase difference between the emission light to the object and the reflected light from the object is 90°. The phase B2 is, for example, a phase in which the object is imaged when the phase difference between the emission light to the object and the reflected light from the object is 180°. The phase B3 is, for example, a phase in which the object is imaged when the phase difference between the emission light to the object and the reflected light from the object is 270°.
[0134] In the frame F1, exposure time in the IR image microframe and the depth image microframe can be individually adjusted (automatic exposure (AE)). For example, the exposure time may be adjusted to be long in the IR image microframe in order to secure brightness, and the exposure time may be adjusted to be short in the depth image microframe in order to control power consumption. In this case, the exposure time in each of the phase A0 and the phase A1 of the IR image microframe may be adjusted to 1 ms, for example. Also, the exposure time in each of the phase B0, the phase B1, the phase B2, and the phase B3 of the depth image microframe may be adjusted to 500 μs, for example. Note that the exposure time in each phase is not limited to these.
[0135] As illustrated in
[0136] The eye-gaze detection microframe includes, for example, two phases that are a phase C0 and a phase C1. The phase C0 is, for example, a phase in which the object is imaged in a state in which the light source unit 11 is off. The phase C1 is, for example, a phase in which the object is imaged in a state in which the light source unit 11 is on.
[0137] In the frame F2, exposure time in the IR image microframe, the depth image microframe, and the eye-gaze detection microframe can be individually adjusted. For example, in a case where a person to be photographed wears glasses, light reflected by the glasses may prevent the eye-gaze from being detected when the eye-gaze detection necessary for face authentication is performed. Thus, in the eye-gaze detection microframe, the exposure time may be adjusted to be shorter than those of the IR image microframe and the depth image microframe in such a manner that the reflection by the glasses does not interfere with the detection. For example, the exposure time in each of the phase C0 and the phase C1 of the eye-gaze detection microframe may be adjusted to 200 μs. Note that the exposure time in each of the phase C0 and the phase C1 is not limited to this.
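The per-microframe exposure settings of the frame F2 can be sketched as a simple configuration table. The structure and names below are illustrative assumptions; only the example values (1 ms, 500 μs, 200 μs) come from the text:

```python
# Hypothetical per-microframe exposure settings for frame F2, mirroring the
# example values in the text. Every phase of a microframe shares one exposure.
FRAME_F2_EXPOSURE_US = {
    "ir_image":  {"phases": ["A0", "A1"],             "exposure_us": 1000},  # secure brightness
    "depth_image": {"phases": ["B0", "B1", "B2", "B3"], "exposure_us": 500},   # control power
    "eye_gaze":  {"phases": ["C0", "C1"],             "exposure_us": 200},   # avoid glasses glare
}

def exposure_for(microframe: str) -> int:
    """Return the exposure time in microseconds applied to each phase
    of the given microframe."""
    return FRAME_F2_EXPOSURE_US[microframe]["exposure_us"]
```

With these example values, the IR image microframe is exposed longest and the eye-gaze detection microframe shortest.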
[0138] As described above, in the first embodiment, an IR image captured in the environment with strong background light such as the sun is corrected on the basis of an IR image captured in a state in which the light source is off, whereby an influence of the sun can be removed. As a result, the recognition accuracy of the face authentication using the IR image captured by the TOF, or the like can be improved.
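The correction of the first embodiment amounts to a per-pixel subtraction of the light-source-off capture from the light-source-on capture. A minimal sketch, with hypothetical names and no handling of motion between captures or of sensor noise:

```python
def correct_ir_image(on_image, off_image):
    """Subtract the light-source-off capture from the light-source-on capture.

    Both captures contain the same background light and dark (sensor offset)
    component, so the per-pixel difference leaves only the component of light
    reflected from the emitted pulse wave. Values are clamped at zero.
    Images are given as nested lists of pixel values.
    """
    return [
        [max(p_on - p_off, 0) for p_on, p_off in zip(row_on, row_off)]
        for row_on, row_off in zip(on_image, off_image)
    ]
```

For instance, a pixel reading 10 with the light source on and 3 with it off yields a reflected-light value of 7 after correction.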
3. Second Embodiment
[0139] Correction method selection processing according to the second embodiment of the present disclosure will be described.
[0140] As described above, when face authentication is performed by utilization of an IR image including strong light such as sunlight, it is possible to remove an influence of the sunlight by using a light-source-off image instead of a dark image. Thus, a recognition rate can be improved. However, for example, when the IR image is corrected by utilization of the light-source-off image in a situation with a small quantity of ambient light, such as indoors, there is a possibility that the contrast of the image becomes small and the recognition rate decreases. Thus, it is preferable to perform switching between a correction using the dark image and a correction using the light-source-off image according to the intensity of background light.
[0141] (3-1. Image Processing Device)
[0142] A configuration of an image processing device according to the second embodiment of the present disclosure will be described with reference to
[0143] As illustrated in
[0144] The correction selecting unit 218 selects a correction method for an IR image. The correction selecting unit 218 receives information related to depth from a reference unit 215, for example. For example, the correction selecting unit 218 receives, from an image correction unit 213, an IR image corrected on the basis of a light-source-off image. The correction selecting unit 218 selects a correction method on the basis of the IR image received from the image correction unit 213 and the information related to the depth received from the reference unit 215.
[0145] (3-2. Correction Selecting Method)
[0146] The correction selecting method will be described with reference to
[0147] For example, the correction selecting unit 218 extracts outlines of a head portion H and a body portion B of the person M on the basis of the information that is related to the depth and received from the reference unit 215. For example, the correction selecting unit 218 calculates a center of gravity G .sub.M of the person M on the basis of the extracted outlines.
[0148] For example, on the basis of the IR image received from the image correction unit 213, the correction selecting unit 218 assumes a region, in which a light quantity is saturated, as the sun S and extracts an outline thereof. For example, the correction selecting unit 218 calculates a center of gravity G.sub.S of the sun S on the basis of the extracted outline.
[0149] The correction selecting unit 218 draws a straight line L1 connecting the center of gravity G.sub.M and the center of gravity G.sub.S. The correction selecting unit 218 draws an orthogonal line O that passes through the center of gravity G.sub.S and that is orthogonal to the straight line L1. For example, the correction selecting unit 218 draws N straight lines (N is an integer equal to or larger than 2), such as a straight line L2 and a straight line L3, from the straight line L1 toward the person M at an angle θ with the center of gravity G.sub.S as an origin within a range of ±90 degrees from the straight line L1.
[0150] The correction selecting unit 218 extracts contact points between the straight lines drawn toward the person M and the outline of the person M. For example, the correction selecting unit 218 extracts a contact point I1 between the straight line L1 and the outline of the person M, a contact point I2 between the straight line L2 and the outline of the person M, and a contact point I3 between the straight line L3 and the outline of the person M.
[0151] The correction selecting unit 218 calculates a distance from the center of gravity G.sub.S to the outline of the person. For example, the correction selecting unit 218 calculates a distance from the center of gravity G.sub.S to the contact point I1. For example, the correction selecting unit 218 calculates a distance from the center of gravity G.sub.S to the contact point I2. For example, the correction selecting unit 218 calculates a distance from the center of gravity G.sub.S to the contact point I3. The correction selecting unit 218 sets the shortest one among the calculated distances as the shortest distance. In the example illustrated in
[0152] For example, when the shortest distance is equal to or shorter than a predetermined value set in advance, the correction selecting unit 218 determines that the sun is close, and selects a correction using a light-source-off image. For example, in a case where the shortest distance exceeds the predetermined value set in advance or it is determined that there is no sun, the correction selecting unit 218 selects a correction using a dark image stored in advance in a storage unit 230.
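The selection logic above can be sketched as follows. This is a simplified, hedged illustration: instead of casting N discrete straight lines as in the text, it takes the minimum distance from the center of gravity of the sun to sampled outline points of the person (the limit of the ray construction as N grows); all names are hypothetical:

```python
import math

def select_correction(person_outline, sun_centroid, threshold):
    """Choose between the light-source-off correction and the stored dark image.

    person_outline: list of (x, y) points sampled on the person's outline.
    sun_centroid: (x, y) center of gravity of the saturated region, or None
                  when no saturated region (i.e. no sun) was found.
    threshold: predetermined distance below which the sun counts as "close".
    """
    if sun_centroid is None:  # no sun in the image: the dark image suffices
        return "dark_image"
    gx, gy = sun_centroid
    # Shortest distance from the sun's center of gravity to the outline.
    shortest = min(math.hypot(x - gx, y - gy) for x, y in person_outline)
    # Sun close to the person: strong background light, so correct with the
    # light-source-off capture; otherwise use the pre-stored dark image.
    return "light_source_off" if shortest <= threshold else "dark_image"
```

For example, an outline point coinciding with the sun's center of gravity gives a shortest distance of 0 and selects the light-source-off correction.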
[0153] Note that, as illustrated in
[0154] Effects of corrections selected by the correction selecting method according to the second embodiment of the present disclosure will be described with reference to
[0155] An IR image IM2 illustrated in
[0156] An IR image IM2A illustrated in
[0157] An IR image IM3 illustrated in
[0158] An IR image IM3A illustrated in
[0159] An IR image IM4 illustrated in
[0160] An IR image IM4A illustrated in
[0161] An IR image IM5 illustrated in
[0162] An IR image IM5A illustrated in
[0163] (3-3. Processing of Correction Selecting Method)
[0164] A flow of the processing of the correction selecting method according to the second embodiment of the present disclosure will be described with reference to
[0165] First, on the basis of information related to depth, the correction selecting unit 218 extracts an outline of a person included in an IR image to be corrected (Step S101). Then, the processing proceeds to Step S102.
[0166] The correction selecting unit 218 calculates a center of gravity of the person on the basis of the outline of the person extracted in Step S101 (Step S102). Then, the processing proceeds to Step S103.
[0167] The correction selecting unit 218 extracts an outline of the sun on the basis of a region with a saturated light quantity in the IR image to be corrected (Step S103). Then, the processing proceeds to Step S104.
[0168] The correction selecting unit 218 calculates a center of gravity of the sun on the basis of the outline of the sun extracted in Step S103 (Step S104). Then, the processing proceeds to Step S105.
[0169] The correction selecting unit 218 draws a straight line connecting the center of gravity of the person calculated in Step S102 and the center of gravity of the sun calculated in Step S104 (Step S105). Then, the processing proceeds to Step S106.
[0170] The correction selecting unit 218 draws a plurality of straight lines to the person from the center of gravity of the sun (Step S106). Specifically, the correction selecting unit 218 draws a plurality of straight lines within a range of ±90 degrees from the straight line drawn in Step S105, with the center of gravity of the sun as the origin. Then, the processing proceeds to Step S107.
[0171] The correction selecting unit 218 calculates the distance from the center of gravity of the sun to the intersection of each of the straight lines drawn in Step S106 with the outline of the person (Step S107). Then, the processing proceeds to Step S108.
[0172] The correction selecting unit 218 determines whether the shortest distance of the straight lines drawn from the center of gravity of the sun to the outline of the person is equal to or shorter than a predetermined value (Step S108). In a case where it is determined that the shortest distance is equal to or shorter than the predetermined value (Yes in Step S108), the processing proceeds to Step S109. In a case where it is determined that the shortest distance is not equal to or shorter than the predetermined value (No in Step S108), the processing proceeds to Step S110.
[0173] In a case where it is determined as Yes in Step S108, the correction selecting unit 218 selects a correction using a light-source-off image (Step S109). Then, the processing of
[0174] On the other hand, in a case where it is determined as No in Step S108, the correction selecting unit 218 selects a correction using a dark image (Step S110). Then, the processing of
[0175] As described above, in the second embodiment, a correction for an IR image can be appropriately selected according to a distance between a person and the sun. Thus, a recognition rate of face authentication or the like can be improved.
4. Modification Example of Second Embodiment
[0176] A modification example of the second embodiment of the present disclosure will be described with reference to
[0177] As described above, in the second embodiment, a correction method is selected on the basis of the shortest distance from the center of gravity of the sun to the outline of a person. Since the information necessary for face authentication is face information, in a modification example of the second embodiment, a correction method may be selected on the basis of the shortest distance from the center of gravity of the sun to the outline of the face of a person.
[0178] As illustrated in
[0179] As illustrated in
[0180] In the example illustrated in
[0181] As described above, in the modification example of the second embodiment, a correction for an IR image can be appropriately selected according to a distance from a face of a person to the sun. Thus, the recognition rate of the face authentication or the like can be further improved.
[0182] (Effect)
[0183] An image processing device 20 according to an aspect of the present disclosure includes an IR image generation unit 212 that generates, in an IR image frame, a first IR image captured in a state in which a pulse wave is on and a second IR image captured in a state in which the pulse wave is off, and an image correction unit 213 that corrects the first IR image on the basis of the second IR image.
[0184] Thus, the IR image captured in a state in which the pulse wave is on can be corrected on the basis of the IR image captured in a state in which the pulse wave is off. As a result, it is possible to remove an influence of strong light such as the sun and to improve a recognition rate.
[0185] Also, the IR image frame may include a phase of generating the first IR image and a phase of generating the second IR image.
[0186] Thus, an IR image in a state in which the pulse wave is on and an IR image in a state in which the pulse wave is off can be generated in one microframe.
[0187] Also, the image correction unit 213 may remove, on the basis of the second IR image, background light and a dark component included in the first IR image.
[0188] Thus, only a component of reflected light can be extracted.
[0189] Also, the image correction unit 213 may individually adjust exposure time of a TOF sensor in each frame.
[0190] Thus, the exposure time in each piece of processing can be appropriately adjusted.
[0191] Also, the image correction unit 213 may individually adjust the exposure time of the TOF sensor in each of the IR image frame and a depth image frame.
[0192] Thus, an IR image and a depth image can be appropriately generated.
[0193] Also, the image correction unit 213 may control the exposure time in a phase included in the IR image frame to be longer than the exposure time in a phase included in the depth image frame.
[0194] Thus, an IR image and a depth image can be appropriately generated, and power consumption can be controlled.
[0195] Also, the image correction unit 213 may individually adjust the exposure time of the TOF sensor in each of the IR image frame, the depth image frame, and an eye-gaze detection frame.
[0196] Thus, an IR image and a depth image can be appropriately generated, and an eye-gaze can be appropriately detected.
[0197] Also, the image correction unit 213 may perform control in such a manner that the exposure time in a phase included in the IR image frame, the exposure time in a phase included in the eye-gaze detection frame, and the exposure time in a phase included in the depth image frame are lengthened in this order.
[0198] Thus, an IR image and a depth image can be generated more appropriately, and an eye-gaze can be detected more appropriately. In addition, power consumption can be controlled.
[0199] A correction selecting unit 218 that selects a correction method according to a positional relationship between a subject included in the first IR image and a light source may be further included.
[0200] Thus, it is possible to select an appropriate correction method according to the positional relationship between the subject and the light source and to improve recognition accuracy.
[0201] The correction selecting unit 218 may select the correction method according to a distance between the subject and the light source.
[0202] Thus, it is possible to select the correction method more appropriately according to the distance between the subject and the light source and to further improve the recognition accuracy.
[0203] The correction selecting unit 218 may select, for the first IR image, either a correction based on the second IR image or a correction based on a dark image stored in advance in a storage unit 230 according to the distance between the subject and the light source.
[0204] Thus, it is possible to select a more appropriate correction method according to the distance between the subject and the light source and to further improve the recognition accuracy.
[0205] For the first IR image, the correction selecting unit 218 may select the correction based on the second IR image in a case where the distance between the subject and the light source is equal to or shorter than a threshold, and may select the correction based on the dark image in a case where the distance between the subject and the light source exceeds the threshold.
[0206] As a result, the recognition accuracy is improved since a more appropriate correction method can be selected according to whether the distance between the subject and the light source exceeds the threshold.
[0207] The subject may be a face of a person and the light source may be the sun.
[0208] Thus, it is possible to improve accuracy of face authentication in the outside where an influence of sunlight is strong.
[0209] Electronic equipment 1 of an aspect of the present disclosure includes a TOF sensor, an IR image generation unit 212 that generates a first IR image captured in a state in which a pulse wave is on and a second IR image captured in a state in which the pulse wave is off in an IR image frame on the basis of an output from the TOF sensor, and an image correction unit 213 that corrects the first IR image on the basis of the second IR image.
[0210] Thus, the IR image captured in a state in which the pulse wave is on can be corrected on the basis of the IR image captured in a state in which the pulse wave is off. As a result, it is possible to remove an influence of strong light such as the sun and to improve a recognition rate.
[0211] In an image processing method of an aspect of the present disclosure, a first IR image captured in a state in which a pulse wave is on and a second IR image captured in a state in which the pulse wave is off are generated in an IR image frame, and the first IR image is corrected on the basis of the second IR image.
[0212] Thus, the IR image captured in a state in which the pulse wave is on can be corrected on the basis of the IR image captured in a state in which the pulse wave is off. As a result, it is possible to remove an influence of strong light such as the sun and to improve a recognition rate.
[0213] A program of an aspect of the present disclosure causes a computer to function as an image generation unit that generates, in an IR image frame, a first IR image captured in a state in which a pulse wave is on and a second IR image captured in a state in which the pulse wave is off, and an image correction unit that corrects the first IR image on the basis of the second IR image.
[0214] Thus, the IR image captured in a state in which the pulse wave is on can be corrected on the basis of the IR image captured in a state in which the pulse wave is off. As a result, it is possible to remove an influence of strong light such as the sun and to improve a recognition rate.
[0215] Note that the effects described in the present description are merely examples and are not limitations, and there may be a different effect.
[0216] Note that the present technology can also have the following configurations. [0217] (1)
[0218] An image processing device comprising:
[0219] an image generation unit that generates, in an IR image frame, a first IR image captured in a state in which a pulse wave is on and a second IR image captured in a state in which the pulse wave is off; and
[0220] an image correction unit that corrects the first IR image on a basis of the second IR image. [0221] (2)
[0222] The image processing device according to (1), wherein
[0223] the IR image frame includes a phase of generating the first IR image and a phase of generating the second IR image. [0224] (3)
[0225] The image processing device according to (1) or (2), wherein
[0226] the image correction unit removes background light and a dark component included in the first IR image on a basis of the second IR image. [0227] (4)
[0228] The image processing device according to any one of (1) to (3), wherein
[0229] the image correction unit individually adjusts exposure time of a TOF sensor in each frame. [0230] (5)
[0231] The image processing device according to (4), wherein
[0232] the image correction unit individually adjusts the exposure time of the TOF sensor in each of the IR image frame and a depth image frame. [0233] (6)
[0234] The image processing device according to (5), wherein
[0235] the image correction unit controls the exposure time in a phase included in the IR image frame to be longer than the exposure time in a phase included in the depth image frame. [0236] (7)
[0237] The image processing device according to (4), wherein
[0238] the image correction unit individually adjusts the exposure time of the TOF sensor in each of the IR image frame, a depth image frame, and an eye-gaze detection frame. [0239] (8)
[0240] The image processing device according to (7), wherein
[0241] the image correction unit performs control in such a manner that the exposure time in a phase included in the IR image frame, the exposure time in a phase included in the eye-gaze detection frame, and the exposure time in a phase included in the depth image frame are lengthened in this order. [0242] (9)
[0243] The image processing device according to any one of (1) to (8), further comprising
[0244] a correction selecting unit that selects a correction method according to a positional relationship between a subject included in the first IR image and a light source. [0245] (10)
[0246] The image processing device according to (9), wherein
[0247] the correction selecting unit selects the correction method according to a distance between the subject and the light source. [0248] (11)
[0249] The image processing device according to (9) or (10), wherein
[0250] the correction selecting unit selects, for the first IR image, either a correction based on the second IR image or a correction based on a dark image stored in advance in a storage unit according to the distance between the subject and the light source. [0251] (12)
[0252] The image processing device according to any one of (9) to (11), wherein
[0253] the correction selecting unit selects, for the first IR image, the correction based on the second IR image in a case where the distance between the subject and the light source is equal to or shorter than a threshold, and selects the correction based on the dark image in a case where the distance between the subject and the light source exceeds the threshold. [0254] (13)
[0255] The image processing device according to any one of (9) to (12), wherein
[0256] the subject is a face of a person, and
[0257] the light source is the sun. [0258] (14)
[0259] Electronic equipment comprising:
[0260] a TOF sensor;
[0261] an image generation unit that generates, on a basis of an output from the TOF sensor, a first IR image captured in a state in which a pulse wave is on and a second IR image captured in a state in which the pulse wave is off in an IR image frame; and
[0262] an image correction unit that corrects the first IR image on a basis of the second IR image. [0263] (15)
[0264] An image processing method comprising:
[0265] generating, in an IR image frame, a first IR image captured in a state in which a pulse wave is on and a second IR image captured in a state in which the pulse wave is off; and
[0266] correcting the first IR image on a basis of the second IR image. [0267] (16)
[0268] A program causing
[0269] a computer to function as
[0270] an image generation unit that generates, in an IR image frame, a first IR image captured in a state in which a pulse wave is on and a second IR image captured in a state in which the pulse wave is off, and
[0271] an image correction unit that corrects the first IR image on a basis of the second IR image.
REFERENCE SIGNS LIST
[0272] 1 ELECTRONIC EQUIPMENT
[0273] 10 IMAGING DEVICE
[0274] 11 LIGHT SOURCE UNIT
[0275] 12 LIGHT RECEIVING UNIT
[0276] 13 IMAGING PROCESSING UNIT
[0277] 20 IMAGE PROCESSING DEVICE
[0278] 30 EMISSION LIGHT
[0279] 31 OBJECT
[0280] 32 REFLECTED LIGHT
[0281] 210 IR IMAGE PROCESSING DEVICE
[0282] 211, 221 ACQUISITION UNIT
[0283] 212 IR IMAGE GENERATION UNIT
[0284] 213 IMAGE CORRECTION UNIT
[0285] 214 NORMALIZATION UNIT
[0286] 215 REFERENCE UNIT
[0287] 216 FIRST EXPOSURE TIME CALCULATION UNIT
[0288] 217 SECOND EXPOSURE TIME CALCULATION UNIT
[0289] 218 CORRECTION SELECTING UNIT
[0290] 220 DEPTH IMAGE PROCESSING DEVICE
[0291] 222 DEPTH CALCULATION UNIT
[0292] 230 STORAGE UNIT