STEREO IMAGING DEVICE
20200099914 · 2020-03-26
Assignee
Inventors
CPC classification
H04N25/135 (ELECTRICITY)
H04N23/665 (ELECTRICITY)
H04N25/133 (ELECTRICITY)
H04N13/239 (ELECTRICITY)
H04N2013/0081 (ELECTRICITY)
International classification
Abstract
A stereo imaging device includes a first image sensor and a second image sensor, each capable of outputting a captured image signal. A monitoring signal and parallax detection signal generating unit and a parallax detection signal generating unit generate two parallax detection signals for detecting a parallax from the two image signals of the two image sensors, and also generate, from an image signal fed from one of the two image sensors, a monitoring signal to be outputted to a monitor. A first reduction processing circuit reduces the monitoring signal at a preset reduction ratio and outputs it. A second reduction processing circuit converts an arbitrary range of the image indicated by the parallax detection signal at an arbitrary reduction ratio, thus outputting a reduced parallax detection signal.
Claims
1. A stereo imaging device that outputs image signals captured by a stereo camera to a monitor for monitoring and outputs the image signals to an image recognition unit that generates at least a distance image from a parallax of the image signals, the stereo imaging device comprising: two image sensors for outputting the image signals; parallax detection signal generating unit for generating two parallax detection signals for detecting a parallax from the two image signals; monitoring signal generating unit for generating a monitoring signal to be outputted to the monitor, from an image signal fed from one of the two image sensors; parallax detection signal reducing unit for reducing and outputting the parallax detection signals; and monitoring signal reducing unit for reducing and outputting the monitoring signal.
2. The stereo imaging device according to claim 1, wherein the parallax detection signal generating unit and the monitoring signal generating unit include two or more line memories, and perform a synchronization processing on the monitoring signal, and perform a smoothing processing on the parallax detection signals by using the line memories.
3. The stereo imaging device according to claim 1, wherein each of the parallax detection signal reducing unit and the monitoring signal reducing unit includes a plurality of line memories, and performs sub-sampling using the line memories.
4. The stereo imaging device according to claim 1, wherein the parallax detection signal generating unit and the monitoring signal generating unit include frame memory, and perform a synchronization processing on the monitoring signal, and perform a smoothing processing on the parallax detection signals by using the frame memory.
5. The stereo imaging device according to claim 1, wherein each of the parallax detection signal reducing unit and the monitoring signal reducing unit includes frame memory, and reduces the parallax detection signals and the monitoring signal by using the frame memory.
6. The stereo imaging device according to claim 1, wherein the parallax detection signal reducing unit is configured to cut out an image represented by a parallax detection signal, in a manner such that the image becomes smaller than its original size, thereby outputting the parallax detection signal in which the pixels of the image have been reduced.
7. The stereo imaging device according to claim 2, wherein each of the parallax detection signal reducing unit and the monitoring signal reducing unit includes a plurality of line memories, and performs sub-sampling using the line memories.
8. The stereo imaging device according to claim 2, wherein each of the parallax detection signal reducing unit and the monitoring signal reducing unit includes frame memory, and reduces the parallax detection signals and the monitoring signal by using the frame memory.
9. The stereo imaging device according to claim 2, wherein the parallax detection signal reducing unit is configured to cut out an image represented by a parallax detection signal, in a manner such that the image becomes smaller than its original size, thereby outputting the parallax detection signal in which the pixels of the image have been reduced.
10. The stereo imaging device according to claim 3, wherein the parallax detection signal reducing unit is configured to cut out an image represented by a parallax detection signal, in a manner such that the image becomes smaller than its original size, thereby outputting the parallax detection signal in which the pixels of the image have been reduced.
11. The stereo imaging device according to claim 4, wherein the parallax detection signal reducing unit is configured to cut out an image represented by a parallax detection signal, in a manner such that the image becomes smaller than its original size, thereby outputting the parallax detection signal in which the pixels of the image have been reduced.
12. The stereo imaging device according to claim 5, wherein the parallax detection signal reducing unit is configured to cut out an image represented by a parallax detection signal, in a manner such that the image becomes smaller than its original size, thereby outputting the parallax detection signal in which the pixels of the image have been reduced.
13. The stereo imaging device according to claim 7, wherein the parallax detection signal reducing unit is configured to cut out an image represented by a parallax detection signal, in a manner such that the image becomes smaller than its original size, thereby outputting the parallax detection signal in which the pixels of the image have been reduced.
14. The stereo imaging device according to claim 8, wherein the parallax detection signal reducing unit is configured to cut out an image represented by a parallax detection signal, in a manner such that the image becomes smaller than its original size, thereby outputting the parallax detection signal in which the pixels of the image have been reduced.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DESCRIPTION OF EMBODIMENTS
[0047] Hereinafter, embodiments of the present invention will be described in detail.
[0048] The stereo imaging device according to the present embodiment uses a stereo camera as a camera mainly for monitoring, such as a surveillance camera or an in-vehicle camera. The stereo imaging device is not for outputting a stereoscopic image, but for outputting one image for use in monitoring and two images for use in detecting a parallax. In the present embodiment, one of the two images of the stereo camera is used for monitoring and is outputted as a two-dimensional color image, while both images are outputted in grayscale as parallax detection images. A parallax detection image is an image for calculating a distance image, which indicates a distance for each pixel, by obtaining the parallax between the two images.
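The distance image derived from the parallax follows the standard stereo triangulation relation Z = f·B/d. A minimal sketch (the function name and any numeric values are illustrative assumptions, not taken from the patent):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Distance Z = f * B / d for one pixel pair.

    disparity_px: horizontal pixel offset between the two views (the parallax)
    focal_px:     focal length expressed in pixels
    baseline_m:   distance between the two camera centers in meters
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

Applying this per pixel to the disparities found between the two grayscale parallax detection images yields the distance image.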
[0049] As shown in
[0050] The first and second image sensors 1, 2 and the synchronization unit 3 together constitute a stereo camera, thus outputting a pair of left and right synchronized image (moving image) data. Further, the first image sensor 1 and the second image sensor 2 of the stereo camera are capable of photographing with visible light and with near-infrared light, and may be image sensors using a double bandpass filter (DBPF) in place of the infrared cut filter usually employed in a normal camera. However, it is also possible to use an infrared cut filter to capture only visible light.
[0051] The first and second image sensors 1, 2 include DBPFs and color filters, serving as cameras for capturing a visible image and an infrared image. The DBPF is configured to transmit light in a visible light band and light in a near-infrared band. The color filter is formed as a mosaic pattern that includes, in addition to the pixel areas of red R, green G and blue B, a white W pixel area which transmits substantially all infrared IR light and visible light.
[0052] The DBPF is an optical filter having a transmission characteristic in the visible light band, a blocking characteristic in a first wavelength band adjacent to the long-wavelength side of the visible light band, and a transmission characteristic in a second wavelength band that is a part of the first wavelength band. The wavelength band between the visible light band and the second wavelength band (the remainder of the first wavelength band) blocks light. Since the stereo camera of the present embodiment does not use the infrared cut filter usually employed in a normal camera, the DBPF transmits infrared light (the second wavelength band), and the color filter transmits light in the white W pixel region. In this case, the infrared light passes not only through the white W pixel region of the color filter but also through the R, G and B pixel regions. That is, whereas an ordinary camera uses an infrared cut filter to eliminate the influence of infrared light, the color filter here has the characteristic of transmitting infrared light.
[0053] In this embodiment, for example, a visible light image and an infrared light image can be finally obtained by calculation. Note that the above-described white W pixel region of the color filter is not a white region, but a colorless and transparent region that transmits visible light and infrared light.
[0054]
[0055] A signal corresponding to the color filter is outputted from the first and second image sensors 1, 2. The output image signal is outputted from the first image sensor 1 to the monitoring signal and parallax detection signal generating unit 4 and sent from the second image sensor 2 to the parallax detection signal generating unit 5.
[0056] As shown in
[0057] In the monitoring signal generation, for example, a color/luminance processing (color/luminance processing circuit 13) for converting RGB signals into luminance and color difference (for example, Y, Cb, Cr) signals is performed, followed by the first reduction processing (first reduction processing circuit 14 (monitoring signal reducing unit)). In addition, since the parallax detection signal generation does not need colors for calculating a parallax, a parallax detection signal processing (parallax detection signal processing circuit 15) is performed which generates, as the parallax detection signal, a so-called grayscale (luminance signal) image signal obtained by smoothing the RGBW signals using line memory without synchronization, followed by the second reduction processing (second reduction processing circuit 16 (parallax detection signal reducing unit)).
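The color/luminance conversion named above is a standard RGB-to-YCbCr matrix; a sketch for a single pixel (the BT.601 full-range coefficients are an assumption — the patent does not specify a matrix):

```python
def rgb_to_ycbcr(r, g, b):
    # BT.601 full-range coefficients (assumed; any luma/chroma matrix
    # with the same structure would serve the monitoring path)
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr
```

For a neutral gray (r = g = b) the color difference outputs Cb and Cr are zero, which is a quick sanity check on the coefficients.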
[0058] As shown in
[0059] As shown in
[0060] Namely, as shown in
[0061] On the other hand, in the first line memory 21, since the output of the above one line is first stored and then outputted, the output will be delayed by one line with respect to the through output signal. Further, in the second line memory 22, since the output of one line of the first line memory 21 is first stored and then outputted, the output will be delayed by two lines with respect to the through signal. When this is considered with reference to the output of the first line memory 21, the through output signal will be earlier by one line than the output of the first line memory 21, and the output of the second line memory 22 will be delayed by one line.
[0062] Also, the output of one line is only the outputs of white W and red R, or only the outputs of green G and blue B. In this case, when the output of the first line memory 21 is used as a reference, if the output of the first line memory 21 is green G and blue B, the through output signal and the output of the second line memory 22 will be white W and red R. On the other hand, when the output of the first line memory 21 is white W and red R, the through output signal and the output of the second line memory 22 will be green G and blue B. Therefore, an output signal of only white W and red R and an output signal of only green G and blue B are outputted simultaneously, by combining the output of the first line memory 21 with the outputs of the through output signal and the second line memory 22.
[0063] As shown in
[0064] For each output of one line, the outputs of the first line memory 21, the second line memory 22 and the through output signal are changed-over between the output of white W and red R and the output of green G and blue B. Meanwhile, with respect to the output of the first line memory 21, the outputs of the through output signal and second line memory 22 are always reversed between the output of white W and red R and the output of green G and blue B.
Therefore, the output of the first line memory 21 is changed over between the output of white W and red R and the output of green G and blue B, while the output of the addition processing unit 24 is changed over between the output of green G and blue B and the output of white W and red R. Accordingly, using the line changeover switches 25, 26, a changeover can be performed between the output of the first line memory 21 and the output of the addition processing unit 24, so that one terminal constantly outputs white W and red R signals during photographing while the other terminal constantly outputs green G and blue B signals, thereby completing the vertical synchronization process.
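The vertical synchronization described in the preceding paragraphs can be modeled in a few lines; the lists stand in for the hardware line memories, and the function name is illustrative:

```python
def vertical_sync(rows):
    """Three-phase model: for line i the 'through' signal (line i+1),
    the first line memory (line i) and the second line memory (line i-1)
    are available at once.  The color pair missing on line i is filled
    with the average of the lines above and below, mimicking the
    addition processing unit 24; the line changeover switches then
    route the two results to fixed W/R and G/B outputs."""
    out = []
    for i in range(1, len(rows) - 1):
        current = rows[i]                            # first line memory
        interpolated = [(a + b) / 2                  # addition unit
                        for a, b in zip(rows[i - 1], rows[i + 1])]
        out.append((current, interpolated))
    return out
```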
[0066] WRWRWR . . . signals and BGBGBG . . . signals generated as described above are sent to the horizontal synchronization circuits 32, 33. Next, in the horizontal synchronization circuit 32 shown in
[0067] Here, description will be given to the horizontal synchronization processing on the outputs of white W and red R. First, the through output signal is a signal in which white W and red R are repeated. The first register 41 stores the output of one pixel of the through output signal and then outputs it, so its output is delayed by one pixel with respect to the through output signal. The output of the second register 42 is produced after the output of the first register 41 is stored for one pixel, and is thus delayed by one pixel with respect to the output of the first register 41.
[0068] Therefore, with respect to the output of the first register 41, the through output signal is earlier by one pixel, and the output of the second register 42 is delayed by one pixel. Here, since the outputs of white W and red R are changed over for each pixel, if the output of the first register 41 is white W, the outputs of the through output signal and the second register 42 will be red R. On the other hand, if the output of the first register 41 is red R, the outputs of the through output signal and the second register 42 will be white W. Accordingly, by combining the output of the first register 41 with the outputs of the through output signal and the second register 42, it is possible to obtain both outputs of white W and red R for one pixel. Here, the through output signal and the output of the second register 42 are added in the pixel addition processing unit 43 and then outputted. This output is an average of the output of the pixel immediately before the pixel corresponding to the output of the first register 41 and the output of the pixel immediately after it.
[0069] The output of the first register 41 is changed over between white W and red R for each pixel, and the output of the pixel addition processing unit 43 is changed over between red R and white W for each pixel. Using the pixel changeover switches 44, 45, a changeover is performed between the output of the first register 41 and the output of the pixel addition processing unit 43 for each pixel. In this way, white W is constantly outputted from the white W terminal during photographing, while red R is constantly outputted from the red R terminal during photographing. Green G and blue B are processed similarly. As a result, the output from the color synchronization processing circuit 12 will produce four signals of white W, red R, green G, and blue B for each pixel, resulting in four images of red, green, blue, and white.
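The horizontal synchronization works the same way one dimension down; a sketch for a single W/R (or G/B) line, with the registers modeled as neighboring list indices (names illustrative):

```python
def horizontal_sync(line):
    """For pixel i the through signal (pixel i+1), the first register
    (pixel i) and the second register (pixel i-1) are available at once.
    The color missing at pixel i is the average of its two neighbors,
    mimicking the pixel addition processing unit 43."""
    full = []
    for i in range(1, len(line) - 1):
        present = line[i]                          # first register output
        missing = (line[i - 1] + line[i + 1]) / 2  # pixel addition unit
        full.append((present, missing))
    return full
```

Applied after the vertical step, every pixel ends up with all four of its W, R, G and B values, completing the synchronization.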
[0070] RGBW signals processed in the synchronization are sent to the color/luminance processing circuit 13 shown in
[0071]
[0072] In addition, signals of three different phases outputted from the line memory unit 11 are sent to the parallax detection signal processing circuit 15 and converted into parallax detection signals for use in parallax detection. Since the parallax detection signal processing circuit 15 does not require colors, the above signals are smoothed and converted to a gray scale (luminance signal). In the parallax detection signal processing circuit 15, a vertical filtering processing (vertical filtering processing circuit 61) is performed. As shown in
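The smoothing to grayscale on the parallax detection path can be sketched as a small vertical filter; the 1-2-1 weights are an assumption (the patent states only that smoothing is performed using the line memories):

```python
def vertical_filter(column, weights=(0.25, 0.5, 0.25)):
    # Weighted average of three vertically adjacent mosaic samples,
    # producing a luminance-like value without color synchronization.
    w0, w1, w2 = weights
    return [w0 * column[i - 1] + w1 * column[i] + w2 * column[i + 1]
            for i in range(1, len(column) - 1)]
```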
[0073] Next, as shown in
[0074] The monitoring signal and the parallax detection signal are outputted in a decimation-processed and reduced state. In this embodiment, as shown in
[0075] In the first reduction processing circuit 14 for the monitoring signal, the signal converted into the luminance signal described above is inputted. In fact, the first reduction processing circuit 14 includes one circuit in which the luminance is decimated and another circuit in which the signal converted into a color difference signal is inputted and the color difference is then decimated. For example, luminance signals and color difference signals are sub-sampled and reduced when they are inputted into line memory, stored there and outputted therefrom. In the first reduction processing circuit 14, particularly in the horizontal/vertical sub-sampling circuit 71 having line memory, sub-sampling is performed that, for example, halves the number of samples both in the horizontal direction and in the vertical direction, thereby reducing the number of samples. Then, the reduced signal is stored in the FIFO circuit 72 and outputted from the FIFO circuit 72 at a lower rate. During the sub-sampling, the number of samples per line is reduced and the number of lines is also reduced, so that sub-sampling is performed in both the horizontal and vertical directions.
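The horizontal/vertical sub-sampling amounts to keeping every second sample on every second line; a sketch (a real implementation streams through line memory and the FIFO circuit 72, but a plain nested list shows the arithmetic):

```python
def subsample_2x(image):
    # Halve both dimensions: every second line, every second sample
    return [row[::2] for row in image[::2]]
```

The result holds one quarter of the original samples, which is what allows the FIFO to clock the reduced signal out at a lower rate.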
[0076] In addition, as shown in
[0077] As shown in
[0078] Here, the image area is reduced by decreasing the numbers of vertical and horizontal pixels at the time of reduction. However, in the mode 2 shown in
[0079] As a result, the image becomes smaller and the analysis range also becomes smaller, but the resolution becomes as high as that before cutting out. For example, it is possible to increase the precision of image recognition.
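The mode-2 cut-out contrasts with downscaling: both shrink the pixel count, but cropping keeps the original sampling density inside the analysis range. A sketch (names illustrative):

```python
def crop(image, top, left, height, width):
    # Extract a full-resolution region instead of downscaling the frame
    return [row[left:left + width] for row in image[top:top + height]]
```

A half-size crop and a 1/2-scale reduction deliver the same number of pixels to the image recognition unit, but only the crop preserves full detail inside its range, which is what raises the recognition precision there.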
[0080] In such a stereo imaging device, as shown in
The cut-out image is not reduced and thus has a high definition; because cutting out makes the image size smaller, it can be processed in the same manner as the reduced image signal C. In the present embodiment, since the processing after the line memory unit 11 is divided into monitoring and parallax detection, even if the magnification of the image or the frame rate is changed on the parallax detection side, the display magnification and frame rate of the monitor do not change, and monitoring is thus not affected. In other words, the current situation can be monitored in real time, while a specific target person such as a criminal can be detected by high-precision image recognition using the parallax of the stereo camera. On the other hand, for the parallax detection signal, the size of the cut-out range may be arbitrarily set, or may be selected from a plurality of preset sizes. Moreover, the reduction ratios of the parallax detection signal and the monitoring signal may be fixed, changed, or selected from a plurality of reduction ratios. However, when the reduction ratio is fixed, the parallax detection signal must be processed in one of two ways, i.e., either reduced, or finely cut out without being reduced.
[0082] In the above description, the synchronization process of the monitoring signal, the smoothing process of the parallax detection signal, and the decimation of the monitoring signal and the parallax detection signal are performed using the line memory. On the other hand, it is also possible to use frame memory instead of line memory in the above processes. In this case, the frame memory may be used for one frame or a plurality of frames. Using the frame memory it is possible to store the data of all the pixels of the frame. For example, in each pixel, it is possible to perform an interpolation processing or a smoothing process using the data of the surrounding pixels, and it is also possible to perform sub-sampling in the pixels arranged in vertical and horizontal directions. By using the frame memory, it is possible to store the values of all pixels for one frame of the image signals outputted from the image sensors 1, 2. Therefore, any known method can be used for interpolation, smoothing, and sub-sampling, thus increasing the degree of freedom in designing a stereo camera.
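With a whole frame buffered, smoothing is no longer limited to the two or three lines a line memory holds; each pixel can use all of its in-bounds neighbors. A 3×3 box average is one simple possibility (the kernel choice is an assumption):

```python
def smooth_with_frame_memory(frame):
    # 3x3 box average over every in-bounds neighbor of each pixel,
    # possible only when the full frame is stored at once
    h, w = len(frame), len(frame[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [frame[j][i]
                    for j in range(max(0, y - 1), min(h, y + 2))
                    for i in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) / len(vals)
    return out
```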
[0083]
EXPLANATION OF REFERENCE NUMERALS
[0084] 1 first image sensor [0085] 2 second image sensor [0086] 4 monitoring signal and parallax detection signal generating unit [0087] 5 parallax detection signal generating unit [0088] 11 line memory unit [0089] 14 first reduction processing circuit [0090] 16 second reduction processing circuit [0091] 21 first line memory [0092] 22 second line memory [0093] 71 horizontal/vertical sub-sampling circuit (line memory)