IMAGING APPARATUS
20230009034 · 2023-01-12
Assignee
Inventors
- Yuta HARUSE (Shizuoka-shi, Shizuoka, JP)
- Shintaro SUGIMOTO (Shizuoka-shi, Shizuoka, JP)
- Teruaki TORII (Shizuoka-shi, Shizuoka, JP)
- Kento NITTA (Shizuoka-shi, Shizuoka, JP)
CPC classification
B60Q1/0023
PERFORMING OPERATIONS; TRANSPORTING
G01S17/894
PHYSICS
H04N23/74
ELECTRICITY
B60Q1/249
PERFORMING OPERATIONS; TRANSPORTING
G06V10/60
PHYSICS
International classification
Abstract
The illumination device has a plurality of light-emitting pixels which are individually on/off controllable, and emits reference light having a random intensity distribution. The photodetector detects light reflected from an object. The processing device reconstructs an image of the object OBJ by calculating a correlation between a detection intensity b based on an output of the photodetector and the intensity distribution I of the reference light. The plurality of light-emitting pixels are divided into m (m≥2) areas each containing n (n≥2) adjoining light-emitting pixels. By selecting one light-emitting pixel from each of the m areas without overlapping, n light-emitting pixel groups are determined. The imaging apparatus carries out sensing for every light-emitting pixel group.
Claims
1. An imaging apparatus comprising: an illumination device having a plurality of light-emitting pixels which are individually on/off controllable, and structured to emit a reference light having a random intensity distribution; a photodetector structured to detect reflected light from an object; and a processing device structured to reconstruct an image of the object, by calculating a correlation between detection intensity based on an output of the photodetector, and the intensity distribution of the reference light, wherein in a case where the plurality of light-emitting pixels are divided into m (m≥2) areas each containing n (n≥2) adjoining light-emitting pixels, the imaging apparatus determines n light-emitting pixel groups by selecting one light-emitting pixel from each of the m areas without overlapping, and the imaging apparatus carries out sensing for every light-emitting pixel group.
2. The imaging apparatus according to claim 1, wherein the n light-emitting pixel groups contain light-emitting pixels at same positions in the m areas.
3. The imaging apparatus according to claim 1, wherein the n light-emitting pixel groups contain light-emitting pixels at different positions in the m areas.
4. The imaging apparatus according to claim 1, wherein the processing device synthesizes n images obtained for the n light-emitting pixel groups, to generate one image.
5. A vehicle lamp comprising the imaging apparatus described in claim 1.
6. A vehicle comprising the imaging apparatus described in claim 1.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] Embodiments will now be described, by way of example only, with reference to the accompanying drawings, which are meant to be exemplary, not limiting; like elements are numbered alike in the several figures.
DETAILED DESCRIPTION
Outline of Embodiments
[0023] A summary of several example embodiments of the disclosure follows. This summary is provided for the convenience of the reader to provide a basic understanding of such embodiments and does not wholly define the breadth of the disclosure. This summary is not an extensive overview of all contemplated embodiments, and is intended to neither identify key or critical elements of all embodiments nor to delineate the scope of any or all aspects. Its sole purpose is to present some concepts of one or more embodiments in a simplified form as a prelude to the more detailed description that is presented later.
[0025] The imaging apparatus according to one embodiment includes: an illumination device that has a plurality of light-emitting pixels which are individually on/off controllable, and emits reference light having a random intensity distribution; a photodetector that detects reflected light from an object; and a processing device that reconstructs an image of the object by calculating a correlation between detection intensity based on an output of the photodetector, and the intensity distribution of the reference light. The plurality of light-emitting pixels are divided into m (m≥2) areas each having n (n≥2) adjoining light-emitting pixels. By selecting one light-emitting pixel from each of the m areas without overlapping, n light-emitting pixel groups are determined. The imaging apparatus carries out sensing for every light-emitting pixel group.
[0026] According to this embodiment, in which the correlation calculation for reconstructing an image is carried out for every light-emitting pixel group, the amount of processing can be reduced as compared with a case where sensing is carried out using all pixels and the correlation calculation is carried out collectively. Since the light-emitting pixels contained in each light-emitting pixel group are distributed over the entire sensing range, the plurality of light detection intensity values obtained from the n light-emitting pixel groups are leveled. The brightness of the restored images obtained for the individual light-emitting pixel groups is therefore also leveled, which prevents the image quality of the restored image from degrading.
[0027] In one embodiment, each light-emitting pixel group may include light-emitting pixels which reside at the same position (coordinate point) in the m areas. This can simplify the switching operation of the light-emitting pixel groups. Furthermore, when synthesizing the n images obtained from the n light-emitting pixel groups into one image, the processing therefor may be simplified.
[0028] In one embodiment, each light-emitting pixel group may include light-emitting pixels which reside at different positions in the m areas. This levels the plurality of light detection intensity values obtained from the n light-emitting pixel groups, and improves the image quality.
[0029] In one embodiment, the processing device may synthesize n images obtainable from the n light-emitting pixel groups, to produce one image. This makes it possible to obtain a restored image whose resolution is equivalent to that obtainable in a case where sensing was carried out with use of all pixels at a time.
Embodiments
[0030] The present invention will be explained below on the basis of preferred embodiments, referring to the attached drawings. Similar or equivalent constituents, members, and processes illustrated in the individual drawings are given the same reference numerals, so as to avoid redundant explanation. The embodiments are merely illustrative and do not restrict the invention. Not all features described in the embodiments, nor their combinations, are necessarily essential to the present invention.
[0031] The phrase “intensity distribution is random” in the present specification does not mean perfect randomness; it requires only randomness sufficient for reconstructing an image by ghost imaging. “Random” in the present specification may therefore encompass a certain degree of regularity. Likewise, “random” does not require complete unpredictability; the distribution may be predictable or reproducible.
[0033] The illumination device 110 is a pseudo-thermal light source, which generates reference light S1 having an intensity distribution I assumable to be substantially random, and casts it on an object OBJ.
[0035] The plurality of light-emitting pixels are divided into m (m≥2) areas, each including n (n≥2) adjoining light-emitting pixels.
[0036] In this example, the illumination device 110 contains k=8×8=64 light-emitting pixels PIX, and each block of two adjoining light-emitting pixels PIX in both the vertical and horizontal directions, that is, n=4 light-emitting pixels PIX, constitutes one area. The number m of areas is then given by k/n=16.
[0037] The plurality of light-emitting pixels PIX are classified into the n light-emitting pixel groups PG1 to PGn (here, PG1 to PG4). Each of the light-emitting pixel groups PG1 to PG4 is formed by selecting one light-emitting pixel PIX from each of the m areas without overlapping. Each light-emitting pixel group therefore contains m light-emitting pixels PIX.
[0038] In this example, each light-emitting pixel group PGi (i=1 to n) contains light-emitting pixels at the same position in the m areas. The position in this context means a relative position (coordinate point) in each area. That is, the light-emitting pixel group PG1 contains the light-emitting pixels PIX on the upper left in the m areas, the light-emitting pixel group PG2 contains the light-emitting pixels PIX on the upper right in the m areas, the light-emitting pixel group PG3 contains the light-emitting pixels PIX on the lower left in the m areas, and the light-emitting pixel group PG4 contains the light-emitting pixels PIX on the lower right in the m areas.
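The same-position grouping described above can be sketched numerically. The following snippet is an illustrative sketch (not part of the specification): for the k=8×8=64 example with 2×2 areas, it derives which group PG1 to PG4 each pixel belongs to, from the pixel's relative position inside its area.

```python
import numpy as np

# Illustrative sketch of paragraphs [0036]-[0038]: an 8x8 pixel grid
# (k = 64) is divided into m = 16 areas of 2x2 adjoining pixels
# (n = 4); group PGi collects the pixel at the same relative position
# (coordinate point) in every area.
H = W = 8          # illumination device resolution (assumed square)
ah = aw = 2        # area size, n = ah * aw = 4

rows, cols = np.mgrid[0:H, 0:W]
# relative position of each pixel inside its 2x2 area -> group index
# 0 = upper left (PG1), 1 = upper right (PG2),
# 2 = lower left (PG3), 3 = lower right (PG4)
group_index = (rows % ah) * aw + (cols % aw)

# each group contains exactly m = (H//ah) * (W//aw) = 16 pixels
for g in range(ah * aw):
    assert np.count_nonzero(group_index == g) == (H // ah) * (W // aw)
```

Switching from one light-emitting pixel group to the next is then a fixed offset pattern, which is what makes the same-position variant simple to drive.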
[0039] In this embodiment, the imaging apparatus 100 carries out sensing for each of the light-emitting pixel groups PG1 to PGn.
[0041] Referring now to the light-emitting pixel group PGi, reference light with the intensity distributions I.sub.i,1 to I.sub.i,max is emitted. In I.sub.i,j, some of the m light-emitting pixels that constitute the i-th light-emitting pixel group PGi are turned ON (bright) and the others are turned OFF (dark), with the distribution of the ON pixels changed in a substantially random manner. The total number of times of illumination, that is, the number of patterns, is given by n×max. Note that the illumination method explained with reference to
[0042] Referring now back to
[0043] Upon illumination of the reference light S1.sub.i,j (i ∈ 1 to n; j ∈ 1 to max) having n×max patterns of intensity distribution I.sub.1,1 to I.sub.1,max, I.sub.2,1 to I.sub.2,max, . . . I.sub.n,1 to I.sub.n,max, the photodetector 120 outputs n×max values of the detection signal D.sub.i,j (i ∈ 1 to n; j ∈ 1 to max).
[0044] The order of illumination patterns of the reference light S1 is not particularly limited. For example, illumination may follow the order of I.sub.1,1 to I.sub.1,max, I.sub.2,1 to I.sub.2,max, I.sub.3,1 to I.sub.3,max, . . . I.sub.n,1 to I.sub.n,max. Alternatively, illumination may follow the order of I.sub.1,1 to I.sub.n,1, I.sub.1,2 to I.sub.n,2, . . . I.sub.1,max to I.sub.n,max.
[0045] The illumination device 110 may include, for example, a light source 112 that generates light S0 having a uniform intensity distribution, and a patterning device 114 capable of spatially modulating the intensity distribution of the light S0. The light source 112 may use a laser, a light emitting diode, or the like. Wavelength or spectrum of the reference light S1 is not particularly limited, and may be white light having a plurality of spectra or a continuous spectrum, or monochromatic light that contains a predetermined wavelength (for example, near infrared or infrared radiation).
[0046] The patterning device 114 may use a digital micromirror device (DMD) or a liquid crystal device. In this embodiment, the patterning device 114 covers an entire measurement range 600, and can illuminate the entire measurement range 600 at a time.
[0047] The patterning device 114 is given a pattern signal PTN.sub.i,j (image data) that designates the intensity distribution I.sub.i,j, from the processing device 130. Hence the processing device 130 knows a position of the area currently illuminated, and the intensity distribution I.sub.i,j of the reference light S1.
[0048] The processing device 130 has a pattern generator 132 and a reconstruction processor 134.
[0049] The pattern generator 132 may randomly generate the intensity distribution I.sub.i,j of the reference light S1 each time. In this case, the pattern generator 132 may include a pseudo-random signal generator.
[0050] Alternatively, a set of intensity distributions I.sub.1 to I.sub.max with a common pattern may be used for the plurality of light-emitting pixel groups PG1 to PGn. For example, the set of patterns I.sub.1,1 to I.sub.1,max of one light-emitting pixel group PG1 may be the set of intensity distributions I.sub.1 to I.sub.max without modification, while the other light-emitting pixel groups PG2 to PGn may use the set of intensity distributions I.sub.1 to I.sub.max after shifting them by several pixels in the horizontal direction and/or the vertical direction.
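The shared-pattern scheme of paragraph [0050] can be sketched as follows. The array sizes and shift amounts below are illustrative assumptions, not values from the specification:

```python
import numpy as np

rng = np.random.default_rng(0)
m_h = m_w = 4                 # 4x4 layout -> m = 16 pixels per group
max_patterns = 8              # illustrative number of patterns

# one shared set of random ON/OFF patterns I_1..I_max
base = rng.integers(0, 2, size=(max_patterns, m_h, m_w))

# PG1 uses the shared set unmodified; PG2..PG4 use the same set
# cyclically shifted by a few pixels (shift amounts are illustrative)
patterns = {1: base}
for i, (dy, dx) in enumerate([(0, 1), (1, 0), (1, 1)], start=2):
    patterns[i] = np.roll(base, shift=(dy, dx), axis=(1, 2))

# shifting preserves the number of ON pixels in every pattern
assert all((p.sum(axis=(1, 2)) == base.sum(axis=(1, 2))).all()
           for p in patterns.values())
```

Reusing one pattern set saves random-number generation and pattern storage, while the shifts keep the groups from illuminating identically.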
[0051] The processing device 130 may be implemented by a combination of a processor (hardware) such as central processing unit (CPU), micro processing unit (MPU), or microcomputer, with software program executed by the processor (hardware). The processing device 130 may be implemented by a combination of a plurality of processors. Alternatively, the processing device 130 may be structured solely by the hardware.
[0052] The reconstruction processor 134 reconstructs the image G.sub.i of the object OBJ, by correlating the plurality of detection intensity values b.sub.i,1 to b.sub.i,max with the intensity distribution values I.sub.i,1 to I.sub.i,max of the reference lights S1.sub.i,1 to S1.sub.i,max, for one PGi of the light-emitting pixel groups PG1 to PGn. Resolution of the image G.sub.i will be given by 1/n times the resolution of the illumination device 110.
[0053] The detection intensity values b.sub.i,1 to b.sub.i,max are derived from the detection signals D.sub.i,1 to D.sub.i,max. Relationship between the detection intensity b.sub.i,j and the detection signal D.sub.i,j may be determined, typically by taking types or schemes of the photodetector 120 into consideration.
[0054] Assume now that the reference light with a certain intensity distribution I.sub.i,j illuminates the object over a certain illumination period, and that the detection signal D.sub.i,j represents the amount of received light at a certain point in time (or micro-time), that is, an instantaneous value. In this case, the detection signal D.sub.i,j may be sampled a plurality of times during the illumination period, and the detection intensity b.sub.i,j may be taken as an integral value, an average value, or a maximum value of all sampled values of the detection signal D.sub.i,j.
[0055] Alternatively, some of all sampled values may be selected, and an integral value, an average value, or a maximum value of the selected sampled values may be used. When selecting the plurality of sampled values, for example, the x-th to y-th values serially from the maximum value may be extracted; sampled values smaller than an arbitrary threshold value may be excluded; or sampled values whose signal variation falls in a small range may be extracted.
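The reduction of the sampled detection-signal values to a single detection intensity, including the optional threshold-based selection of paragraph [0055], might be sketched as follows; the function name, mode names, and sample values are illustrative:

```python
import numpy as np

def detection_intensity(samples, mode="integral", threshold=None):
    """Reduce the samples of D_ij taken during one illumination
    period to one detection intensity b_ij, as in paragraphs
    [0054]-[0055]. Names and modes are illustrative assumptions."""
    s = np.asarray(samples, dtype=float)
    if threshold is not None:
        # exclude sampled values smaller than the threshold
        s = s[s >= threshold]
    if mode == "integral":
        return s.sum()
    if mode == "average":
        return s.mean()
    if mode == "maximum":
        return s.max()
    raise ValueError(f"unknown mode: {mode}")

samples = [0.2, 1.1, 0.9, 0.05, 1.3]
b_int = detection_intensity(samples)                                  # ~3.55
b_avg = detection_intensity(samples, mode="average")
b_thr = detection_intensity(samples, mode="integral", threshold=0.5)  # ~3.3
```

Which reduction suits a given system depends on the photodetector scheme, as paragraph [0053] notes.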
[0056] If the photodetector 120 is a device like a camera, for which the exposure time may be preset, the output D.sub.i,j of the photodetector 120 may be used directly as the detection intensity b.sub.i,j.
[0057] Conversion from the detection signal D.sub.i,j to the detection intensity b.sub.i,j may be executed by the processing device 130, or may take place outside the processing device 130.
[0058] The image G.sub.i corresponding to the i-th (1≤i≤n) light-emitting pixel group PGi is restored with use of the correlation function represented by equation (1), where I.sub.i,j represents the j-th (1≤j≤max) intensity distribution and b.sub.i,j represents the j-th detection intensity.
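As an illustrative sketch, the standard ghost-imaging correlation, G_i(x, y) = ⟨(b_i,j − ⟨b_i⟩)·I_i,j(x, y)⟩ averaged over j, which is consistent with the reference to ⟨b.sub.i⟩ in paragraph [0071], can be written as follows. The actual equation (1) of the specification may differ in normalization:

```python
import numpy as np

def reconstruct(intensities, b):
    """Ghost-imaging correlation: G(x, y) = <(b_j - <b>) I_j(x, y)>.
    `intensities`: (max, h, w) ON/OFF patterns, `b`: (max,) detection
    intensities. A sketch of the standard GI correlation, assumed to
    match the form of equation (1)."""
    I = np.asarray(intensities, dtype=float)
    b = np.asarray(b, dtype=float)
    # contract over the pattern index j, then normalize by max
    return np.tensordot(b - b.mean(), I, axes=1) / len(b)

# toy check: with an ideal bucket detector, b_j = sum(I_j * T),
# the reconstruction correlates with the transmission map T
rng = np.random.default_rng(1)
T = np.zeros((4, 4)); T[1:3, 1:3] = 1.0                # toy object
I = rng.integers(0, 2, size=(256, 4, 4)).astype(float)
b = (I * T).sum(axis=(1, 2))
G = reconstruct(I, b)
assert G[1:3, 1:3].mean() > G[0, :].mean()             # object is brighter
```

With independent random patterns the background pixels average toward zero, while pixels inside the object accumulate a positive correlation, which is the principle exploited here.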
[0059] The processing device 130 may generate one image by synthesizing the plurality of images G.sub.1 to G.sub.n obtained for the plurality of light-emitting pixel groups PG1 to PGn. The resulting image has high resolution, equal to the resolution of the illumination device 110.
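The synthesis of the n low-resolution images into one full-resolution image can be sketched as an interleave, assuming the same-position grouping of paragraph [0038] (the 2×2 layout below is illustrative):

```python
import numpy as np

def synthesize(sub_images, ah=2, aw=2):
    """Interleave the n low-resolution images G_1..G_n back to full
    resolution, assuming group g holds the pixel at relative position
    (g // aw, g % aw) of every area (illustrative layout)."""
    n = len(sub_images)
    assert n == ah * aw
    h, w = sub_images[0].shape
    out = np.empty((h * ah, w * aw))
    for g, img in enumerate(sub_images):
        out[g // aw::ah, g % aw::aw] = img
    return out

# four 2x2 sub-images, filled with their group number for visibility
subs = [np.full((2, 2), g) for g in range(4)]
full = synthesize(subs)
assert full.shape == (4, 4)
```

Because each group contributes a fixed, regular lattice of pixel positions, the synthesis is a pure reindexing with no interpolation.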
[0060] The structure of the imaging apparatus 100 has been described. Next, the advantage will be explained.
[0061] The number of times of calculations in the imaging apparatus 100 is as follows.
[0062] Let the number of pixels in the entire measurement range be X×Y, and the numbers of pixels in the horizontal and vertical directions of each light-emitting pixel group be x and y. Then X×Y=(x×y)×n.
[0063] Assuming now that one-time pattern illumination is necessary to restore one pixel, then the number of times of illumination max′ necessary to restore x×y pixels will be
max′=(x×y).
[0064] Note that the actual number of times of illumination, which may be more or less than max′, is approximately proportional to (x×y). The number of times of calculation per area is then (x×y).sup.2. The total number of times of calculation O′ for all the n light-emitting pixel groups PG1 to PGn is
O′=n×(x×y).sup.2.
[0065] The number of times of calculation O by the prior method, which illuminates the entire measurement range at once, is
O=(X×Y).sup.2.
[0066] Since the relationship of X×Y=(x×y)×n holds, the number of times of calculation in the present embodiment may be reduced to O′/O=1/n times as compared with the prior method.
[0067] Assuming now a case with X=Y=1024. Given n=16, and given that the number of pixels of the light-emitting pixel group PG is x=y=256, then the number of times of calculation can be reduced to 1/16 times.
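The arithmetic of paragraphs [0062] to [0067] can be checked directly, assuming one pattern per restored pixel (max′ = x×y) as in the text:

```python
# Calculation-count comparison from paragraphs [0062]-[0067].
X = Y = 1024          # pixels in the entire measurement range
n = 16                # number of light-emitting pixel groups
xy = (X * Y) // n     # pixels per group; x = y = 256 in the example

O_prior = (X * Y) ** 2   # illuminate the entire range at once
O_new = n * xy ** 2      # per-group sensing, n groups

assert xy == 256 * 256
assert O_new * n == O_prior   # reduction factor is exactly 1/n = 1/16
```

The 1/n reduction follows because substituting x×y = (X×Y)/n into O′ = n×(x×y)² gives O′ = (X×Y)²/n.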
[0068] Reduction in the number of times of calculation means that the calculation time can be shortened with a processing device of the same processing speed. Alternatively, the same processing time can be achieved with a slower (and thus less expensive) processing device.
[0069] Next, further advantages of the imaging apparatus 100 will be explained.
[0070] According to the area division method, as illustrated in
[0071] Referring now back to the area division method, the brightness of the obtained image varies from area to area, so that the boundaries of the areas become visible in the synthesized image, degrading the image quality. This is because the brightness of the image is affected by <b.sub.i> in equation (1). More specifically, in the upper left area, only the bottom-side pixels corresponding to the head of the elephant, out of 32×32 pixels, are occupied by the object; the object thus occupies only a small proportion of the pixels, resulting in a small <b.sub.i>. In contrast, in the lower left and lower right areas, the proportion of pixels corresponding to the body and legs of the elephant, out of 32×32 pixels, is large; the object thus occupies a large proportion of the pixels, resulting in a large <b.sub.i>. This is why the brightness varies from area to area.
[0072] In contrast, the raster scanning method levels the proportion of pixels occupied by the object across the light-emitting pixel groups PG1 to PG4, so that the light-emitting pixel groups PG1 to PG4 have similar values of <b.sub.i>. The brightness of G.sub.1 to G.sub.4 can therefore be made uniform, without area boundaries appearing in the synthesized image, thus improving the image quality.
[0073] Another advantage is that, in a case where each light-emitting pixel group PGi includes light-emitting pixels at the same position as illustrated in
[0074] It is to be understood by those skilled in the art that these embodiments are merely illustrative, that the individual constituents or combinations of various processes may be modified in various ways, and that also such modifications fall within the scope of the present disclosure. Such modified examples will be explained below.
Modified Example 1
[0075] Although
Modified Example 2
[0076] The present invention is applicable not only to sensing devices that use GI (ghost imaging) as the image reconstruction algorithm, but also to those that use DGI (differential ghost imaging) or another algorithm.
Modified Example 3
[0077] The illumination device 110 in the embodiment was structured by combining the light source 112 and the patterning device 114, but not restrictively. The illumination device 110 may alternatively be structured with an array of a plurality of semiconductor light sources (light emitting diodes (LEDs) or laser diodes (LDs)) arranged in a matrix, so as to make each of the semiconductor light sources on/off (luminance) controllable.
Applications
[0078] Next, applications of the imaging apparatus 100 will be explained.
[0080] The object identification system 10 includes the imaging apparatus 100 and a processing device 40. The imaging apparatus 100, as described above, illuminates the object OBJ with the reference light S1, and detects the reflected light S2, thereby generating a restored image G of the object OBJ.
[0081] The processing device 40 processes the output image G of the imaging apparatus 100, and determines the position and type (category) of the object OBJ.
[0082] A classifier 42 of the processing device 40 receives the image G as input, and determines the position and the type of the object OBJ contained therein. The classifier 42 is implemented according to a model generated by machine learning. The algorithm of the classifier 42 is not particularly limited, and may be YOLO (You Only Look Once), SSD (Single Shot MultiBox Detector), R-CNN (Region-Based Convolutional Neural Network), SPPnet (Spatial Pyramid Pooling), Faster R-CNN, DSSD (Deconvolution-SSD), Mask R-CNN, or any algorithm developed in the future.
[0083] The structure of the object identification system 10 has been described. The following advantages can be expected from the use of the imaging apparatus 100 as a sensor of the object identification system 10.
[0084] First, noise resistance improves remarkably with use of the imaging apparatus 100, that is, the quantum radar camera. For example, even when the object OBJ is poorly recognizable to the naked eye during travel in rain, snow, or fog, the imaging apparatus 100 can obtain a restored image G of the object OBJ without being affected by the rain, snow, or fog.
[0085] Second, carrying out the sensing for every light-emitting pixel group PG reduces the amount of computation. This increases the frame rate, and allows an inexpensive processor to be chosen for the processing device.
[0086] Note that, in the imaging apparatus 100, the number n of the light-emitting pixel groups PG may be adaptively changed on the basis of the travel environment.
[0089] Information on the object OBJ detected by the processing device 40 may be used for light distribution control of the vehicle lamp 200. More specifically, the lamp-borne ECU 208 generates an appropriate light distribution pattern, on the basis of the information on the type and position of the object OBJ, generated by the processing device 40. The lighting circuit 204 and the optical system 206 operate so as to obtain the light distribution pattern generated by the lamp-borne ECU 208.
[0090] The information on the object OBJ detected by the processing device 40 may alternatively be sent to a vehicle-borne ECU 304. The vehicle-borne ECU 304 may carry out automatic driving on the basis of this information.
[0091] The embodiments merely illustrate an aspect of the principle and applications of the present invention, allowing a variety of modifications and layout changes without departing from the spirit of the present invention specified by the claims.