IMAGING DEVICE, IMAGING SYSTEM, AND IMAGING METHOD
20220390383 · 2022-12-08
Inventors
CPC classification
G01N21/8851
PHYSICS
H04N23/70
ELECTRICITY
G03B15/05
PHYSICS
H04N23/00
ELECTRICITY
H04N25/77
ELECTRICITY
G03B11/00
PHYSICS
G06V10/28
PHYSICS
International classification
G03B11/00
PHYSICS
Abstract
Provided is an imaging device including: an imaging unit (130) that generates a one frame image by sequentially receiving each reflected light reflected by a subject by intermittently and sequentially irradiating the subject with each irradiation light having a different wavelength according to a position of the moving subject, temporarily and sequentially holding signal information based on the reflected light of each wavelength, and collectively reading the held signal information; and a combining unit (140) that generates a combined image by cutting a subject image corresponding to the reflected light of each wavelength from the one frame image and superimposing a plurality of the cut subject images.
Claims
1. An imaging device comprising: an imaging unit that generates a one frame image by sequentially receiving each reflected light reflected by a subject by intermittently and sequentially irradiating the subject with each irradiation light having a different wavelength according to a position of the moving subject, temporarily and sequentially holding signal information based on the reflected light of each wavelength, and collectively reading the held signal information; and a combining unit that generates a combined image by cutting a subject image corresponding to the reflected light of each wavelength from the one frame image and superimposing a plurality of the cut subject images.
2. The imaging device according to claim 1, wherein the imaging unit has a plurality of pixels, and each of the pixels includes an imaging element that receives the reflected light to generate the signal information, and a memory unit that temporarily holds the signal information from the imaging element.
3. The imaging device according to claim 2, wherein the imaging unit operates in a global shutter system that collectively reads the signal information held in the each memory unit.
4. The imaging device according to claim 2, wherein the pixel includes an InGaAs imaging element that detects near infrared light.
5. The imaging device according to claim 2, wherein the combining unit cuts each subject image corresponding to the reflected light of each wavelength by cutting a plurality of predetermined regions designated in advance from the one frame image.
6. The imaging device according to claim 2, wherein the combining unit has an imaging region specifying unit that specifies each subject image corresponding to the reflected light of each wavelength in the one frame image.
7. The imaging device according to claim 6, wherein the combining unit further has a binarization processing unit that converts the one frame image into a two-step color tone to generate a two-step color tone image, and the imaging region specifying unit specifies each subject image corresponding to the reflected light of each wavelength, on the basis of the two-step color tone image.
8. The imaging device according to claim 6, wherein the imaging unit sequentially receives each reference light reflected by the subject by intermittently and sequentially irradiating, with the each irradiation light, the subject moving at a constant speed along a predetermined direction before and after irradiation with the each irradiation light, the imaging unit generates the one frame image including a subject image corresponding to the reference light by temporarily and sequentially holding the signal information based on the reference light and collectively reading the held signal information, and the imaging region specifying unit specifies a subject image corresponding to the reflected light of each wavelength located between subject images corresponding to the two reference lights, on the basis of the subject images corresponding to the two reference lights.
9. The imaging device according to claim 2, wherein the combining unit has a combining processing unit that calculates a color parameter of each of the pixels in the subject image corresponding to the reflected light of each wavelength, on the basis of color information set in advance to correspond to the each wavelength and signal information of each of the pixels in the subject image corresponding to the reflected light of each wavelength, calculates an addition average of the color parameters in the plurality of subject images for each of the pixels, and generates a color image as the combined image on the basis of the calculated addition average.
10. The imaging device according to claim 1, wherein the imaging unit has a plurality of filters that are provided to face the subject, are sequentially arranged along a moving direction of the subject, and transmit light of different wavelengths.
11. The imaging device according to claim 10, wherein the plurality of filters are an on-chip color filter or a plasmon filter.
12. The imaging device according to claim 1, further comprising: an irradiation unit that intermittently and sequentially irradiates the subject with the irradiation light having a different wavelength according to a position of the moving subject.
13. The imaging device according to claim 12, wherein the irradiation unit has a plurality of light emitting elements that emit light of different wavelengths.
14. The imaging device according to claim 13, wherein the plurality of light emitting elements include a plurality of light emitting diodes that emit near infrared light.
15. The imaging device according to claim 13, wherein the plurality of light emitting elements include a reference light emitting element that emits reference light having a predetermined wavelength other than near infrared light.
16. The imaging device according to claim 15, wherein the reference light emitting element emits visible light as the reference light.
17. The imaging device according to claim 12, further comprising: a control unit that controls the imaging unit such that the imaging unit receives the reflected light in synchronization with irradiation by the irradiation unit.
18. An imaging device comprising: an imaging unit that generates a one frame image by sequentially receiving each reflected light reflected by a subject by intermittently and sequentially irradiating the subject with each irradiation light having a different wavelength according to a position of the moving subject, temporarily and sequentially holding signal information based on the reflected light of each wavelength, and collectively reading a part of the signal information corresponding to a plurality of predetermined regions designated in advance in the held signal information; and a combining unit that generates a combined image by cutting a subject image corresponding to the reflected light of each wavelength from the one frame image and superimposing a plurality of the cut subject images.
19. An imaging system comprising: a moving device that moves a subject; an irradiation device that intermittently and sequentially irradiates the subject with irradiation light having a different wavelength according to a position of the moving subject; an imaging apparatus that generates a one frame image by sequentially receiving each reflected light reflected by the subject by the irradiation, temporarily and sequentially holding signal information based on the reflected light of each wavelength, and collectively reading the held signal information; and a combining device that generates a combined image by cutting a subject image corresponding to the reflected light of each wavelength from the one frame image and superimposing a plurality of the cut subject images.
20. An imaging method comprising: generating a one frame image by sequentially receiving each reflected light reflected by a subject by intermittently and sequentially irradiating the subject with each irradiation light having a different wavelength according to a position of the moving subject, temporarily and sequentially holding signal information based on the reflected light of each wavelength, and collectively reading the held signal information; and generating a combined image by cutting a subject image corresponding to the reflected light of each wavelength from the one frame image and superimposing a plurality of the cut subject images.
Description
BRIEF DESCRIPTION OF DRAWINGS
DESCRIPTION OF EMBODIMENTS
[0031] Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Note that, in the present specification and the drawings, redundant description of components having substantially the same functional configuration is omitted by assigning the same reference numerals. Further, in the present specification and the drawings, similar components of different embodiments may be distinguished by appending different letters after the same reference numerals. However, when it is unnecessary to particularly distinguish each of the similar components, only the same reference numeral is assigned.
[0032] Note that the description will be given in the following order.
[0033] 1. Background of embodiments of present disclosure created by present inventors
[0034] 2. First Embodiment
[0035] 2.1 Outline of imaging system
[0036] 2.2 Detailed configuration of imaging module
[0037] 2.3 Imaging method
[0038] 2.4 First modification
[0039] 2.5 Second modification
[0040] 3. Second Embodiment
[0041] 3.1 Detailed configuration of imaging module
[0042] 3.2 Imaging method
[0043] 4. Third Embodiment
[0044] 4.1 Detailed configuration of imaging module
[0045] 4.2 Imaging method
[0046] 5. Fourth Embodiment
[0047] 5.1 Detailed configuration of imaging module
[0048] 5.2 Imaging method
[0049] 6. Fifth Embodiment
[0050] 6.1 Detailed configuration of imaging module
[0051] 6.2 Imaging method
[0052] 7. Sixth Embodiment
[0053] 8. Application to mobile object
[0054] 9. Summary
[0055] 10. Supplement
[0056] An embodiment described below will be described as being applied to an inspection device that inspects, in a manufacturing line installed at a manufacturing site or the like, the presence or absence of scratches, the presence or absence of mixed foreign material, and whether the appearance of a manufactured product qualifies it as an acceptable product suitable for shipment, on the basis of an image of the appearance of the product. However, the present embodiment is not limited to being applied to the inspection device, and may be applied to other devices or other purposes.
[0057] Further, the embodiment described below will be described as being applied to an imaging module operating in a global shutter system. Note that, in the following description, the global shutter system means a system for collectively reading imaging signals (signal information) obtained by respective imaging elements of the imaging module and generating a one frame image on the basis of the read imaging signals. However, the present embodiment is not limited to being applied to the imaging module of the global shutter system, and may be applied to imaging modules of other systems.
[0058] Furthermore, in the following description, since one frame means one reading, the one frame image is an image generated by performing collective reading of the imaging signals once.
1. Background of Embodiments of Present Disclosure Created by Present Inventors
[0059] As described above, in order to more accurately capture the appearance of the product, it has been proposed, for example, to combine into one image a plurality of spectral images, obtained by finely dividing the wavelength of light related to a subject image into a plurality of wavelength bands and detecting each band, and to use the combined image.
[0060] Specifically, as an example of a method for obtaining a spectral image, the imaging module uses optical components such as a diffraction grating and a mirror to disperse light in a vertical direction for one horizontal line and detect the light. In addition, the subject or the imaging module is moved (scanned) in a horizontal direction at a constant speed to disperse and detect the light as described above, thereby acquiring a two-dimensional image for each wavelength of the light.
[0061] Further, in Patent Literature 1 described above, strobe light beams having different wavelengths are continuously emitted to a subject, and the reflected light from the subject is spatially separated so as to be incident on different positions on a light receiving surface on which a plurality of imaging elements of an imaging module are arranged, thereby detecting the light. However, in Patent Literature 1, as described above, a large number of optical components such as a diffraction grating and a mirror are required to spatially separate the reflected light, and it is difficult to avoid a complicated configuration and an increase in the manufacturing cost of the imaging module.
[0062] Further, in Patent Literature 2 described above, an image for each wavelength is detected by switching the wavelength of light emitted from a light source for each frame. Specifically, in Patent Literature 2 described above, in a case where images of three different wavelengths are to be obtained, an imaging time of three frames is required. Therefore, in Patent Literature 2 described above, it is difficult to suppress an increase in the processing time for obtaining an image, and real-time performance is poor.
[0063] Accordingly, in view of such a situation, the present inventors have created embodiments of the present disclosure that can obtain a combined image of spectral images with a simple configuration and at high speed. Specifically, according to the embodiments of the present disclosure, a large number of optical components such as a diffraction grating and a mirror are not required, it is possible to avoid a complicated configuration and an increase in the manufacturing cost of the imaging module, and furthermore, it is possible to generate an image obtained by combining a plurality of spectral images into one image within an imaging time of one frame. Hereinafter, details of the embodiments of the present disclosure created by the present inventors will be sequentially described.
2. First Embodiment
[0064] <2.1 Outline of Imaging System>
[0065] First, a configuration of an imaging system 10 according to an embodiment of the present disclosure will be described with reference to
[0066] (Imaging Module 100)
[0067] The imaging module 100 irradiates the subject 800 with light, receives reflected light from the subject 800, generates a one frame image, and generates a combined image from the one frame image. A detailed configuration of the imaging module 100 will be described later.
[0068] (Control Server 200)
[0069] The control server 200 can control the imaging module 100 and furthermore, can monitor or control a traveling speed of the belt conveyor 300 described later, a position of the subject 800 on the belt conveyor 300, and the like. The control server 200 is realized by hardware such as a central processing unit (CPU), a read only memory (ROM), and a random access memory (RAM).
[0070] (Belt Conveyor 300)
[0071] The belt conveyor 300 is a moving device capable of moving the subject 800 under the control of the control server 200. Alternatively, the belt conveyor 300 is a moving device whose traveling speed is monitored by the control server 200 and which can move the subject 800. Note that, in the present embodiment, the moving device is not limited to the belt conveyor 300, and is not particularly limited as long as the moving device can move the subject 800.
[0072] Note that the devices in the imaging system 10 according to the present embodiment are communicably connected to each other via a network (not illustrated in the drawings). Specifically, for example, the imaging module 100, the control server 200, and the belt conveyor 300 can be connected to the network via a base station or the like (for example, a base station of a mobile phone, an access point of a wireless local area network (LAN), and the like) not illustrated in the drawings. Note that, as a communication system used in the network, any wired or wireless system (for example, WiFi (registered trademark), Bluetooth (registered trademark), and the like) can be applied, but it is desirable to use a communication system capable of maintaining stable operation.
[0073] <2.2 Detailed Configuration of Imaging Module>
[0074] Next, a configuration of the imaging module 100 according to the present embodiment will be described with reference to
[0075] (Irradiation Unit 110)
[0076] The irradiation unit 110 can intermittently and sequentially irradiate the subject 800 with irradiation light having different wavelengths (for example, wavelengths λ₁ to λ₇) according to positions of the moving subject 800 (pulse irradiation). Specifically, as illustrated in
[0077] (Imaging Device 120)
[0078] As illustrated in
[0079] —Imaging Unit 130—
[0080] The imaging unit 130 can sequentially receive the reflected light having the respective wavelengths (for example, the wavelengths λ₁ to λ₇) reflected by the moving subject 800. Further, the imaging unit 130 can generate a one frame image by temporarily and sequentially holding each imaging signal (signal information) based on the reception of the reflected light of each wavelength and then collectively reading the held imaging signals. Specifically, the imaging unit 130 has an optical system mechanism (not illustrated in the drawings) including a lens unit 132, a diaphragm mechanism (not illustrated in the drawings), a zoom lens (not illustrated in the drawings), a focus lens (not illustrated in the drawings), and the like. Furthermore, the imaging unit 130 has a plurality of imaging elements 134 that photoelectrically convert the light obtained by the optical system mechanism to generate imaging signals, a plurality of memory units 136 that temporarily hold the generated imaging signals, and a reading unit 138 that collectively reads the imaging signals from the plurality of memory units 136. Note that, although one imaging element 134 and one memory unit 136 are illustrated in
[0081] More specifically, the optical system mechanism uses the above-described lens unit 132 or the like to condense the reflected light from the subject 800 on the plurality of imaging elements 134 as an optical image. For example, the imaging element 134 can be a compound sensor such as an InGaAs photodiode (InGaAs imaging element) capable of detecting near infrared light, or can be a silicon photodiode capable of detecting visible light. In addition, the plurality of imaging elements 134 are arranged in a matrix on a light receiving surface (surface on which an image is formed), and each imaging element 134 photoelectrically converts the formed optical image in units of pixels (units of imaging elements) to generate a signal of each pixel as an imaging signal. In addition, the plurality of imaging elements 134 output the generated imaging signals to the memory units 136 provided in units of pixels, for example. The memory units 136 can temporarily hold the output imaging signals. Furthermore, the reading unit 138 can output a one frame image to the combining unit 140 by collectively reading the imaging signals from the plurality of memory units 136. That is, in the present embodiment, the imaging unit 130 can operate in a global shutter system that collectively reads the imaging signals held in the respective memory units 136.
[0082] Further, in the present embodiment, for example, with the arrival of the subject 800 at the imaging position by the belt conveyor 300 as a trigger, the irradiation by the irradiation unit 110 described above and imaging of the global shutter system (multiple exposure) by the imaging unit 130 are performed in synchronization with each other.
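The multiple-exposure, collective-read behavior described above can be sketched numerically. The following is a minimal sketch, assuming a toy sensor size, three wavelength pulses, and a fixed per-pulse displacement of the subject; none of these values come from the disclosure:

```python
import numpy as np

H, W = 8, 32          # toy sensor size (assumption)
NUM_WAVELENGTHS = 3   # number of irradiation pulses (assumption)
SUBJECT_W = 6         # subject width in pixels (assumption)

# Per-pixel memory (cf. memory units 136): signals are held here
# between exposures and read out only once per frame.
memory = np.zeros((H, W))

for k in range(NUM_WAVELENGTHS):
    # The subject has moved between pulses, so the k-th reflected-light
    # image lands at a different horizontal position on the sensor.
    x0 = k * (SUBJECT_W + 2)
    exposure = np.zeros((H, W))
    exposure[:, x0:x0 + SUBJECT_W] = 1.0  # toy reflected-light signal
    memory += exposure                    # temporarily hold the signal

# Collective reading (global shutter): one frame containing all
# spectral subject images side by side.
one_frame = memory.copy()
```

Because all exposures are read out in a single collective read, the spectral subject images appear side by side in one frame, which is what the combining unit 140 later cuts apart.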
[0083] Furthermore, a planar configuration example of the plurality of imaging elements 134 described above will be described below with reference to
[0084] The pixel array unit 410 has a plurality of imaging elements (pixels) 134 two-dimensionally disposed in a matrix on the semiconductor substrate 500. Further, the plurality of pixels 134 may include normal pixels for generating pixel signals for image generation and a pair of phase difference detection pixels for generating pixel signals for focus detection. Each of the pixels 134 has a plurality of InGaAs imaging elements (photoelectric conversion elements) and a plurality of pixel transistors (for example, metal-oxide-semiconductor (MOS) transistors) (not illustrated in the drawings). More specifically, the pixel transistors can include a transfer transistor, a selection transistor, a reset transistor, and an amplification transistor, for example.
[0085] The vertical drive circuit unit 432 includes a shift register, for example, selects a pixel drive wire 442, supplies a pulse for driving the pixels 134 to the selected pixel drive wire 442, and drives the pixels 134 in units of rows. That is, the vertical drive circuit unit 432 selectively scans each of the pixels 134 of the pixel array unit 410 in a vertical direction (up-down direction in
[0086] The column signal processing circuit unit 434 is disposed for each column of the pixels 134, and performs signal processing such as noise removal for each pixel column on the pixel signals output from the pixels 134 for one row. For example, the column signal processing circuit unit 434 can perform signal processing such as correlated double sampling (CDS) and analog-digital (AD) conversion in order to remove pixel-specific fixed pattern noise.
[0087] The horizontal drive circuit unit 436 includes a shift register, for example, sequentially outputs horizontal scanning pulses to sequentially select each of the column signal processing circuit units 434 described above, and can output the pixel signal from each of the column signal processing circuit units 434 to a horizontal signal line 446.
[0088] The output circuit unit 438 can perform signal processing on the pixel signals sequentially supplied from each of the column signal processing circuit units 434 described above through the horizontal signal line 446, and can output the signals. The output circuit unit 438 may function as a functional unit that performs buffering, for example, or may perform processing such as black level adjustment, column variation correction, and various digital signal processing. Note that the buffering means temporarily storing the pixel signals in order to compensate for differences in processing speed and transfer speed in exchanging the pixel signals. Further, an input/output terminal 448 is a terminal for exchanging signals with an external device, and is not necessarily provided in the present embodiment.
[0089] The control circuit unit 440 can receive an input clock and data for giving an instruction on an operation mode or the like, and can output data such as internal information of the pixel 134. That is, the control circuit unit 440 generates a clock signal or a control signal to be a reference of the operation of the vertical drive circuit unit 432, the column signal processing circuit unit 434, the horizontal drive circuit unit 436, or the like, on the basis of a vertical synchronization signal, a horizontal synchronization signal, and a master clock. Then, the control circuit unit 440 outputs the generated clock signal or control signal to the vertical drive circuit unit 432, the column signal processing circuit unit 434, the horizontal drive circuit unit 436, or the like.
[0090] Note that the planar configuration example of the imaging element 134 according to the present embodiment is not limited to the example illustrated in
[0091] —Combining Unit 140—
[0092] The combining unit 140 cuts subject images corresponding to the reflected light of the respective wavelengths (for example, the wavelengths λ₁ to λ₇) from the one frame image output from the imaging unit 130, and superimposes a plurality of cut subject images to generate a combined image. The combining unit 140 is realized by, for example, hardware such as a CPU, a ROM, and a RAM. Specifically, as illustrated in
[0093] The binarization processing unit 142 can generate a two-step color tone image (for example, a black-and-white image) by performing binarization processing of converting the one frame image output from the imaging unit 130 into a two-step color tone. For example, the binarization processing unit 142 can generate a black-and-white image by comparing the imaging signal of each pixel in the shaded (grayscale) one frame image with a predetermined threshold, converting pixels whose imaging signals are on one side of the threshold into white, and converting pixels whose imaging signals are on the other side into black. In the present embodiment, as described above, the binarization processing is performed on the shaded one frame image to convert the image into a black-and-white image. As a result, the contour of the imaging of the subject 800 in the one frame image is clarified, and the subject image described later can be specified easily and accurately.
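As a concrete illustration of this thresholding step, the following is a minimal sketch; the threshold value and pixel values are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

def binarize(frame: np.ndarray, threshold: int) -> np.ndarray:
    """Convert a grayscale one-frame image into a two-step color tone:
    pixels above the threshold become white (255), the rest black (0)."""
    return np.where(frame > threshold, 255, 0).astype(np.uint8)

# Toy one-frame image with shading (values are illustrative)
frame = np.array([[ 10, 200,  30],
                  [180, 250,  20],
                  [ 15, 190,  25]], dtype=np.uint8)
binary = binarize(frame, threshold=100)
```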
[0094] The imaging region specifying unit 144 specifies each subject image (for example, a region of interest (ROI)) corresponding to the reflected light of each of the wavelengths (for example, the wavelengths λ₁ to λ₇) in the one frame image. Specifically, for example, the imaging region specifying unit 144 can specify the center coordinates (for example, X and Y coordinates) of imaging of each subject 800 in the one frame image by detecting the contour of imaging of each subject 800 included in the two-step color tone image generated by the binarization processing unit 142. Further, the imaging region specifying unit 144 can specify the ROI, which is a region of imaging of the subject 800 corresponding to the reflected light of each wavelength, on the basis of the specified center coordinates. For example, the imaging region specifying unit 144 can specify each ROI in a one frame image by superimposing the center of a rectangular extraction frame having a preset size capable of including imaging of the subject 800 on the specified center coordinates. Note that, in the present embodiment, the extraction frame is not limited to a rectangular shape, and may have a polygonal shape, a circular shape, or a shape equal or similar to the shape of the subject 800 as long as the extraction frame has a size capable of including imaging of the subject 800. Furthermore, in the present embodiment, the specification of the ROI is not limited to being performed on the basis of the center coordinates, and may be performed, for example, on the basis of the detected contour of imaging of each subject 800 and is not particularly limited.
[0095] Further, in the present embodiment, the imaging region specifying unit 144 may specify each ROI by specifying an imaging position of an identification marker (not illustrated in the drawings) provided on the surface of the subject 800 in the one frame image without using the two-step color tone image. Furthermore, in the present embodiment, the imaging region specifying unit 144 may specify each ROI on the basis of a plurality of predetermined regions (for example, the coordinates of each vertex of the region are set in advance) designated in advance by the user in the one frame image without using the two-step color tone image.
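The center-coordinate approach of paragraph [0094] can be sketched as follows. This is a simplified illustration assuming a single subject image in the binary image and an assumed extraction-frame size; a real one frame image would contain one subject image per wavelength, so connected-component labeling would be needed first and is omitted here for brevity:

```python
import numpy as np

FRAME_H, FRAME_W = 4, 4   # extraction-frame size set in advance (assumption)

def specify_roi(binary: np.ndarray) -> tuple:
    """Find the center coordinates of a single subject image in a
    two-step color tone image and return a fixed-size extraction frame
    (as a pair of slices) centered on those coordinates."""
    ys, xs = np.nonzero(binary)
    cy, cx = int(ys.mean()), int(xs.mean())   # center coordinates
    y0 = max(cy - FRAME_H // 2, 0)
    x0 = max(cx - FRAME_W // 2, 0)
    return slice(y0, y0 + FRAME_H), slice(x0, x0 + FRAME_W)

# Toy binary image containing one 2x2 subject image at rows 3-4, cols 5-6
binary = np.zeros((8, 12), dtype=np.uint8)
binary[3:5, 5:7] = 255
roi = specify_roi(binary)
```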
[0096] Then, the combining processing unit 146 cuts each ROI from the one frame image on the basis of each ROI specified by the imaging region specifying unit 144 and superimposes a plurality of cut ROIs to generate a combined image. Specifically, the combining processing unit 146 performs position adjustment such that the center of imaging of the subject 800 included in each ROI and the contour are matched with each other, and superimposes the plurality of ROIs to generate a combined image. Note that, in the present embodiment, the combining processing unit 146 may generate a combined image by superimposing the plurality of ROIs in a state in which imaging of identification markers (not illustrated in the drawings) provided on the surface of the subject 800, included in the respective ROIs, are matched with each other, and the combining processing unit 146 is not particularly limited. Furthermore, when the combined image is generated, the combining processing unit 146 can generate pseudo color images with reference to color information (for example, red, green, and blue in a visible light band are assigned) assigned in advance to each of the wavelengths (for example, the wavelengths λ₁ to λ₇). In the present embodiment, a combined image of the pseudo color images is generated as described above, so that visibility of details in the image can be improved. Note that details of generation of the pseudo color image will be described later.
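The superimposition itself can be sketched as per-pixel addition averaging of the cut-out ROIs, assuming the position adjustment has already aligned them to a common extraction-frame size; the ROI values below are illustrative assumptions:

```python
import numpy as np

def combine(rois: list) -> np.ndarray:
    """Superimpose cut-out subject images by per-pixel addition
    averaging, assuming they were already position-adjusted so that the
    subject's center and contour match across all cut-outs."""
    return np.stack(rois).astype(np.float64).mean(axis=0)

# Three toy 2x2 spectral cut-outs (values are illustrative)
rois = [np.full((2, 2), v, dtype=np.uint8) for v in (30, 60, 90)]
combined = combine(rois)
```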
[0097] —Control unit 150—
[0098] The control unit 150 can control the imaging unit 130 such that the reflected light is received in synchronization with the irradiation of the irradiation unit 110. The control unit 150 is realized by hardware such as a CPU, a ROM, and a RAM, for example.
[0099] As described above, in the imaging module 100 according to the present embodiment, a large number of optical components such as a diffraction grating and a mirror are not required, and it is possible to avoid a complicated configuration and an increase in the manufacturing cost of the imaging module 100. That is, according to the present embodiment, the imaging module 100 having a simple configuration can be provided.
[0100] <2.3 Imaging Method>
[0101] The imaging system 10 according to the present embodiment and the configurations of the devices included in the imaging system 10 have been described in detail above. Next, an imaging method according to the present embodiment will be described with reference to
[0102] (Step S101)
[0103] For example, the control unit 150 monitors the traveling speed of the belt conveyor 300 (for example, under constant speed control) or the position of the subject 800 on the belt conveyor 300, in cooperation with the control server 200.
[0104] (Step S103)
[0105] The control unit 150 determines whether or not the subject 800 has reached a photographing start position. In the present embodiment, when the subject 800 has reached the photographing start position, the process proceeds to the next step S105, and when the subject 800 has not reached the photographing start position, the process returns to the previous step S101. That is, in the present embodiment, as described above, the irradiation/light reception operation is performed in synchronization with the traveling of the belt conveyor 300. Note that the present embodiment is not limited to using the arrival of the subject 800 at the photographing start position as a trigger for the process to proceed, and other events or the like may be used as a trigger. Further, in the present embodiment, the event serving as the trigger may be acquired from each device in the imaging system 10 or may be acquired from a device outside the imaging system 10, and the acquisition of the event is not particularly limited.
[0106] (Step S105)
[0107] The control unit 150 controls the light emitting element 112 corresponding to the position of the subject 800 and causes the light emitting element 112 to irradiate the subject 800 with light having a predetermined wavelength (for example, near infrared light of one of the wavelengths λ₁ to λ₇). Specifically, as illustrated in
[0108] (Step S107)
[0109] The control unit 150 controls the plurality of imaging elements 134 such that the plurality of imaging elements 134 receive the reflected light from the subject 800, in synchronization with the irradiation of the light emitting element 112 in step S105. For example, as illustrated in
[0110] (Step S109)
[0111] The control unit 150 controls the corresponding light emitting element 112 and ends the irradiation.
[0112] (Step S111)
[0113] The control unit 150 determines whether or not all the light emitting elements 112 perform irradiation. In the present embodiment, when all the light emitting elements 112 perform the irradiation, the process proceeds to a next step S113, and when all the light emitting elements 112 do not perform the irradiation, the process returns to the previous step S105.
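The irradiation/light-reception loop of steps S105 to S113 can be sketched as follows. This is a minimal Python model for illustration only (the class, names, and positions are assumptions, not the patent's implementation): each irradiation/light-reception cycle is held in the per-pixel memory units, and all held signals are later read out collectively as one frame.

```python
from dataclasses import dataclass, field

@dataclass
class PixelMemorySensor:
    # Toy model of the imaging unit 130: each exposure is temporarily
    # held in the memory units and read out collectively (global shutter).
    held_signals: list = field(default_factory=list)

    def expose(self, wavelength, position):
        # One irradiation/light-reception cycle (steps S105 to S109):
        # the subject image for this wavelength lands at the subject's
        # current position within the frame.
        self.held_signals.append((wavelength, position))

    def collective_read(self):
        # Step S113: all held signals are read together as one frame image.
        frame, self.held_signals = self.held_signals, []
        return frame

def capture_one_frame(sensor, wavelengths, start, step):
    # Repeat irradiation for every light emitting element (loop of step
    # S111), assuming the subject advances by `step` between exposures.
    for i, wl in enumerate(wavelengths):
        sensor.expose(wl, start + i * step)
    return sensor.collective_read()

sensor = PixelMemorySensor()
frame = capture_one_frame(sensor, ["l1", "l2", "l3"], start=0, step=10)
# frame now holds three subject images at positions 0, 10, 20
```

Because the memory units hold every exposure until the single collective read, the spectral images of all wavelengths end up inside one frame image, which is what makes the subsequent cutting and combining possible.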
[0114] That is, in the present embodiment, as illustrated in the upper part of
[0115] (Step S113)
[0116] The control unit 150 controls the reading unit 138, collectively reads the imaging signals stored in each memory unit 136, acquires a one frame image including the imaging 802 (spectral image) of the subject 800 corresponding to each of the wavelengths (for example, the wavelengths λ.sub.1 to λ.sub.7), and outputs the acquired one frame image to the combining unit 140 (global shutter system).
[0117] (Step S115)
[0118] The control unit 150 controls the binarization processing unit 142 of the combining unit 140 and performs binarization processing of converting the one frame image acquired in step S113 into a two-step color tone to generate a two-step color tone image. For example, in step S115, a black-and-white image illustrated in the middle part of
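The binarization of step S115 amounts to thresholding each pixel into one of two tones. A minimal NumPy sketch (the threshold value is an assumed example; the patent does not specify one):

```python
import numpy as np

def binarize(one_frame, threshold=128):
    # Two-step color tone conversion: pixels at or above the threshold
    # become white (255), all others black (0).
    return np.where(np.asarray(one_frame) >= threshold, 255, 0).astype(np.uint8)

one_frame = np.array([[10, 200], [130, 90]], dtype=np.uint8)
bw = binarize(one_frame)  # → [[0, 255], [255, 0]]
```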
[0119] (Step S117)
[0120] As illustrated in the lower part of
[0121] (Step S119)
[0122] As illustrated in the lower part of
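The cutting and superimposing of steps S117 and S119 can be sketched as below. The region format and the use of per-pixel averaging as the "superimpose" operation are assumptions for illustration; the patent leaves the exact combining operation open.

```python
import numpy as np

def cut_rois(one_frame, regions):
    # Cut each subject image (ROI) from the one frame image.
    # Each region is (row, col, height, width).
    return [one_frame[r:r + h, c:c + w] for (r, c, h, w) in regions]

def superimpose(rois):
    # Combine the equally sized cut ROIs into one image by per-pixel
    # averaging (one plausible way to superimpose them).
    return np.stack([r.astype(np.float64) for r in rois]).mean(axis=0)

one_frame = np.arange(16, dtype=np.uint8).reshape(4, 4)
rois = cut_rois(one_frame, [(0, 0, 2, 2), (2, 2, 2, 2)])
combined = superimpose(rois)  # → [[5.0, 6.0], [9.0, 10.0]]
```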
[0123] Hereinafter, an example of a method for generating the pseudo color image in the present embodiment will be described. Here, for easy understanding, a method for generating the pseudo color image will be described assuming that a one frame image including the imaging 802 (spectral image) of the subject 800 corresponding to the three wavelengths λ.sub.1 to λ.sub.3 is acquired.
[0124] In this example, it is assumed that, as the color information, red is assigned in advance to the wavelength λ.sub.1, green is assigned in advance to the wavelength λ.sub.2, and blue is assigned in advance to the wavelength λ.sub.3. Furthermore, in the present embodiment, it is assumed that the plurality of imaging elements 134 of the imaging module 100 can detect visible light when the pseudo color image is generated. That is, it is assumed that, on a light receiving surface of the imaging module 100 where the plurality of imaging elements 134 are disposed in a matrix, an imaging element that detects red, an imaging element that detects green, and an imaging element that detects blue are arranged in a Bayer array.
[0125] Then, in the present embodiment, under the above-described assignment and assumption conditions, the pseudo color images can be combined as follows. Specifically, first, as illustrated in the upper left part of
[0126] Note that, in the present embodiment, the combining of the pseudo color images 806 is not limited to the above-described example, and may be performed using another method, such as taking an addition average of color parameters for each pixel.
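Under the three-wavelength assignment described above (red for λ.sub.1, green for λ.sub.2, blue for λ.sub.3), one plausible way to form the pseudo color image is to place each wavelength's grayscale subject image into its assigned RGB channel. This channel-stacking scheme is an illustrative assumption; the note above mentions per-pixel addition averaging as an alternative.

```python
import numpy as np

def pseudo_color(roi_l1, roi_l2, roi_l3):
    # red <- wavelength λ1, green <- λ2, blue <- λ3 (assignment from the text);
    # each input is a grayscale subject image of the same shape.
    return np.stack([roi_l1, roi_l2, roi_l3], axis=-1)

base = np.full((2, 2), 100, dtype=np.uint8)
rgb = pseudo_color(base, base // 2, np.zeros_like(base))
# every pixel becomes the color triple (100, 50, 0)
```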
[0127] (Step S121)
[0128] The control unit 150 controls the combining processing unit 146 such that the combining processing unit 146 outputs the generated combined image to the control unit 150. Furthermore, the output combined image is converted into an appropriate format by the control unit 150 and is output to the control server 200 or the like, for example.
[0129] As described above, according to the present embodiment, since the combined image is generated using a one frame image, it is possible to generate an image obtained by combining a plurality of spectral images into one image in an imaging time for one frame. That is, according to the present embodiment, it is possible to obtain a combined image of spectral images at high speed.
[0130] <2.4 First Modification>
[0131] In the present embodiment, as a modification, as described above, each ROI can be specified on the basis of a plurality of predetermined regions designated in advance by the user in the one frame image without using the two-step color tone image.
[0132] <2.5 Second Modification>
[0133] Note that, in the above description, the combined image is generated by superimposing the plurality of cut ROIs, but the present embodiment and the first modification are not limited thereto. For example, in the present embodiment and the first modification, the one frame image may be output, or the ROIs cut from the one frame image may be output. For example, in this case, since images corresponding to the respective wavelengths of light can be separately acquired and analyzed, it is possible to easily recognize the presence or absence, distribution, and the like of components for the corresponding wavelengths.
3. Second Embodiment
[0134] In the first embodiment described above, a plurality of light emitting elements 112 are provided in order to emit light having different wavelengths. However, by using a plurality of filters 162 (see
[0135] <3.1 Detailed Configuration of Imaging Module>
[0136] First, a detailed configuration of an imaging module 100 according to the second embodiment of the present disclosure will be described. In the following description, descriptions of points common to those of the above-described first embodiment will be omitted, and only different points will be described. In the present embodiment, the irradiation unit 110 of the imaging module 100 has the light emitting element 112d (see
[0137] Further, an imaging unit 130 of an imaging device 120 according to the present embodiment further has the filter unit 160 (see
[0138] Further, in the present embodiment, for example, with the arrival of the subject 800 at the imaging position by the belt conveyor 300 as a trigger, imaging of a global shutter system (multiple exposure) by the imaging unit 130 is performed.
[0139] As described above, in the present embodiment, since the use of the plurality of filters 162 makes a single type of light emitting element 112d sufficient, it is possible to avoid a complicated configuration of the irradiation unit 110 and an increase in manufacturing cost of the irradiation unit 110. Further, in the present embodiment, the irradiation unit 110 can be omitted by using a general indoor light, natural light, or the like. That is, according to the present embodiment, the irradiation unit 110 having a special configuration is unnecessary, and a general lighting device or the like can be used.
[0140] <3.2 Imaging Method>
[0141] The detailed configuration of the imaging module 100 according to the present embodiment has been described above. Next, the imaging method according to the present embodiment will be described with reference to
[0142] First, in the present embodiment, the above-described light emitting element 112d starts light irradiation.
[0143] (Steps S201 to S205)
[0144] Since steps S201 to S205 according to the present embodiment are similar to steps S101, S103, and S107 according to the first embodiment illustrated in
[0145] (Step S207)
[0146] The control unit 150 determines whether or not the imaging unit 130 receives reflected light of all wavelengths. In the present embodiment, when the reflected light of all the wavelengths is received, the process proceeds to a next step S209, and when the reflected light of all the wavelengths is not received, the process returns to the previous step S205.
[0147] In the present embodiment, as illustrated in the upper part of
[0148] (Steps S209 to S217)
[0149] Since steps S209 to S217 according to the present embodiment are similar to steps S113 to S121 according to the first embodiment illustrated in
4. Third Embodiment
[0150] In the first embodiment described above, a one frame image including imaging 802 of a subject 800 corresponding to reflected light of each wavelength (for example, wavelengths λ.sub.1 to λ.sub.7) is converted into a two-step color tone image, and a position of the imaging 802 of each subject 800 in the one frame image is specified. On the other hand, in a third embodiment of the present disclosure described below, in order to further improve the accuracy of the position detection, the position of the imaging 802 of the subject 800 corresponding to the reflected light of each wavelength other than visible light may be specified using a one frame image including the imaging 802 of the subject 800 corresponding to the reflected light of the visible light (reference light). In this way, in the present embodiment, since a visible light image or the like in which the contour of the imaging 802 is easily detected in the two-step color tone image can be used, the position of the imaging 802 of the subject 800 corresponding to reflected light of near infrared light or the like other than the visible light can be accurately detected. Therefore, the third embodiment will be described with reference to
[0151] <4.1 Detailed Configuration of Imaging Module>
[0152] First, a detailed configuration of an imaging module 100 according to the present embodiment will be described. In the following description, descriptions of points common to those of the above-described first embodiment will be omitted, and only different points will be described. In the present embodiment, as compared with the first embodiment, an irradiation unit 110 further has a light emitting element (reference light emitting element) 112f (see
[0153] Also in the present embodiment, for example, with the arrival of the subject 800 at the imaging position by a belt conveyor 300 as a trigger, irradiation by the irradiation unit 110 including the light emitting element 112f and imaging of a global shutter system (multiple exposure) by the imaging unit 130 are performed in synchronization with each other.
[0154] <4.2 Imaging Method>
[0155] The detailed configuration of the imaging module 100 according to the present embodiment has been described above. Next, an imaging method according to the present embodiment will be described with reference to
[0156] (Steps S301 and S303)
[0157] Since steps S301 and S303 according to the present embodiment are similar to steps S101 and S103 according to the first embodiment illustrated in
[0158] (Step S305)
[0159] The control unit 150 controls the light emitting element 112 corresponding to the position of the subject 800 and alternately irradiates the subject 800 with visible light (for example, the wavelength λ.sub.ref) and near infrared light having a predetermined wavelength (for example, the wavelengths λ.sub.1 to λ.sub.7). Specifically, as illustrated in
[0160] (Step S307)
[0161] The control unit 150 controls the plurality of imaging elements 134 such that the plurality of imaging elements 134 receive the reflected light from the subject 800, in synchronization with the irradiation of the light emitting element 112 in step S305. For example, as illustrated in FIG. 12, the plurality of imaging elements 134 perform light reception in synchronization with the irradiation of the light emitting elements 112a, 112b, and 112f, generate the imaging 802 of the subject obtained by the light reception as an imaging signal, and output the generated imaging signal to each memory unit 136. Furthermore, each memory unit 136 temporarily holds the imaging signal. Note that the imaging signal corresponds to the imaging 802 of the subject 800 corresponding to each wavelength (for example, the wavelengths λ.sub.1 to λ.sub.7 and λ.sub.ref) of the light emitted by the light emitting elements 112a, 112b, and 112f in step S305, and each imaging 802 is included in a one frame image to be described later.
[0162] (Steps S309 to S313)
[0163] Since steps S309 to S313 according to the present embodiment are similar to steps S109 to S113 according to the first embodiment illustrated in
[0164] (Step S315)
[0165] The control unit 150 controls the binarization processing unit 142 of the combining unit 140 and performs binarization processing of converting the one frame image acquired in step S313 into a two-step color tone to generate a two-step color tone image (for example, a black-and-white image is generated).
[0166] Furthermore, as illustrated in the middle part of
[0167] (Steps S317 to S321)
[0168] Since steps S317 to S321 according to the present embodiment are similar to steps S117 to S121 according to the first embodiment illustrated in
[0169] As described above, in the present embodiment, the position of the imaging 802 of the subject 800 corresponding to the near infrared light or the like can be accurately detected by calculating the center between the two imaging 802 corresponding to the visible light, whose contours are easy to detect in the two-step color tone image.
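The center calculation above can be sketched as a midpoint between the positions of the two visible-light (reference) subject images. Representing a position as a (row, col) centroid is an assumption for illustration:

```python
def center_of_references(pos_a, pos_b):
    # Midpoint of the two visible-light reference subject images; the
    # near infrared subject images are then located relative to this
    # accurately detected center.
    return ((pos_a[0] + pos_b[0]) / 2, (pos_a[1] + pos_b[1]) / 2)

center = center_of_references((10, 40), (10, 120))  # → (10.0, 80.0)
```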
5. Fourth Embodiment
[0170] In the first embodiment described above, a position of imaging 802 of a subject 800 corresponding to near infrared light or the like is detected using a two-step color tone image. On the other hand, in an embodiment described below, when the position of the imaging 802 of the subject 800 in the one frame image is known in advance, only an imaging signal from a corresponding pixel (imaging element 134) in each ROI may be acquired from the beginning on the basis of a plurality of predetermined regions designated in advance by a user in the one frame image. In this way, in the present embodiment, it is possible to reduce an amount of imaging signals to be collectively read by a reading unit 138 and to reduce the burden of subsequent processing. Therefore, a fourth embodiment of the present disclosure will be described with reference to
[0171] <5.1 Detailed Configuration of Imaging Module>
[0172] A detailed configuration of an imaging module 100 according to the present embodiment is common to the first embodiment described above, except that a combining unit 140 is not provided with a binarization processing unit 142 and an imaging region specifying unit 144. Therefore, description thereof is omitted here.
[0173] <5.2 Imaging Method>
[0174] Next, the imaging method according to the present embodiment will be described with reference to
[0175] (Steps S401 to S405)
[0176] Since steps S401 to S405 according to the present embodiment are similar to steps S101 to S105 according to the first embodiment illustrated in
[0177] (Step S407)
[0178] Similarly to the first embodiment, a control unit 150 controls a plurality of imaging elements 134 such that the plurality of imaging elements 134 receive the reflected light from the subject 800, in synchronization with the irradiation of a light emitting element 112 in step S405. In the present embodiment, as illustrated in
[0179] (Steps S409 and S411)
[0180] Since steps S409 and S411 according to the present embodiment are similar to steps S109 and S111 according to the first embodiment illustrated in
[0181] (Step S413)
[0182] The control unit 150 controls the combining processing unit 146 of the combining unit 140 and cuts each ROI included in the one frame image.
[0183] (Steps S415 and S417)
[0184] Since steps S415 and S417 according to the present embodiment are similar to steps S119 and S121 according to the first embodiment illustrated in
[0185] As described above, in the present embodiment, when the position of the imaging 802 of the subject 800 in the one frame image is known in advance, only the imaging signal from the corresponding pixel in each ROI 804 is acquired from the beginning on the basis of the plurality of predetermined regions designated in advance by the user in the one frame image. In this way, according to the present embodiment, it is possible to reduce the amount of imaging signals to be collectively read by the reading unit 138 and to reduce the burden of subsequent processing.
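The ROI-limited readout of this embodiment can be sketched as follows; the pixel-signal map and region format are illustrative assumptions, not the reading unit 138's actual interface:

```python
def read_roi_signals(pixel_signals, regions):
    # Read only the pixels inside the user-designated regions (ROIs),
    # instead of collectively reading the whole frame, reducing the
    # amount of imaging signals to be read.
    # pixel_signals: dict mapping (row, col) -> signal value.
    # regions: list of (row, col, height, width) designated in advance.
    rois = []
    for (r, c, h, w) in regions:
        rois.append({(i, j): pixel_signals[(i, j)]
                     for i in range(r, r + h)
                     for j in range(c, c + w)
                     if (i, j) in pixel_signals})
    return rois

# A 4x4 frame's worth of per-pixel signals, but only a 2x2 ROI is read.
signals = {(i, j): 10 * i + j for i in range(4) for j in range(4)}
rois = read_roi_signals(signals, [(0, 0, 2, 2)])
# rois[0] contains only the four pixels of the designated region
```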
6. Fifth Embodiment
[0186] In the present disclosure, an imaging method according to the fourth embodiment described above can also be applied to an imaging module 100 according to the second embodiment. In this way, according to an embodiment described below, an amount of imaging signals collectively read by a reading unit 138 can be reduced, and the burden of subsequent processing can be reduced. In addition, an irradiation unit 110 having a special configuration is unnecessary, and a general lighting device or the like can be used. Therefore, a fifth embodiment of the present disclosure will be described with reference to
[0187] <6.1 Detailed Configuration of Imaging Module>
[0188] A detailed configuration of an imaging module 100 according to the present embodiment is common to the second embodiment described above, except that a combining unit 140 is not provided with a binarization processing unit 142 and an imaging region specifying unit 144. Therefore, description thereof is omitted here.
[0189] <6.2 Imaging Method>
[0190] Next, the imaging method according to the present embodiment will be described with reference to
[0191] (Steps S501 and S503)
[0192] Since steps S501 and S503 according to the present embodiment are similar to steps S101 and S103 according to the first embodiment illustrated in
[0193] (Step S505)
[0194] A control unit 150 controls a plurality of imaging elements 134 such that the plurality of imaging elements 134 receive reflected light from a subject 800. In the present embodiment, similarly to the above-described fourth embodiment, as illustrated in
[0195] (Step S507)
[0196] Since step S507 according to the present embodiment is similar to step S207 according to the second embodiment illustrated in
[0197] (Step S509)
[0198] Since step S509 according to the present embodiment is similar to step S413 according to the fourth embodiment illustrated in
[0199] (Steps S511 and S513)
[0200] Since steps S511 and S513 according to the present embodiment are similar to steps S119 and S121 according to the first embodiment illustrated in
[0201] As described above, according to the present embodiment, the amount of imaging signals collectively read by the reading unit 138 can be reduced, and the burden of subsequent processing can be reduced. In addition, the irradiation unit 110 having the special configuration is unnecessary, and a general lighting device or the like can be used.
7. Sixth Embodiment
[0202] The technology according to the embodiment of the present disclosure described above is applicable to any electronic apparatus that uses an imaging apparatus in its image capturing unit, such as a digital still camera or a video camera, a mobile terminal device having an imaging function, and a copying machine using an imaging apparatus in an image reading unit. Furthermore, the technology of the present disclosure is also applicable to automobiles, robots, aircraft, drones, various inspection apparatuses (for example, food inspection or the like), medical devices (endoscopes), and the like. Hereinafter, an example of an electronic apparatus 900 to which the technology according to the embodiment of the present disclosure is applied will be described as a sixth embodiment of the present disclosure with reference to
[0203] As illustrated in
[0204] Note that the embodiment of the present disclosure is not limited to the examples described above, and can be applied to an inspection device that inspects, in a manufacturing line installed at a manufacturing site or the like, the presence or absence of scratches, the presence or absence of mixing of a foreign material, and whether or not the appearance of a manufactured product is acceptable for shipment, on the basis of an image of the appearance of the product. For example, the present embodiment can be applied to appearance inspection of an industrial product (presence or absence of scratches and determination of shipment conformity of the appearance of a manufactured product) and the like. In addition, since the present embodiment can use light of various wavelengths, the present embodiment can also be used, for example, for foreign material mixing inspection of pharmaceuticals, foods, or the like, on the basis of light absorption characteristics specific to substances (light absorption characteristics specific to a foreign material can be used). Furthermore, in the present embodiment, since light of various wavelengths can be used, it is possible, for example, to perform color recognition or to detect the depth at which a scratch or a foreign material is located, which are difficult with visible light alone.
8. Application to Mobile Object
[0205] Further, in the embodiment of the present disclosure, instead of moving the subject 800, an imaging module 100 may be mounted on a mobile object to cause the imaging module 100 side to move. For example, when the imaging module 100 is mounted on a mobile object such as a drone, light of a predetermined wavelength may be emitted in a case where the subject 800 is positioned directly below a light emitting element 112 of the imaging module 100. In this case, according to the present embodiment, it is possible to detect a distribution of a specific component on a ground surface or the like and detect a state of a crop. That is, the imaging module 100 according to the present embodiment may be realized as a device mounted on any kind of mobile object, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility, an airplane, a drone, a ship, or a robot.
[0206]
[0207] A vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001. In the example illustrated in
[0208] The drive system control unit 12010 controls the operation of the devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 12010 functions as a control device of a driving force generation device for generating a driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating a braking force of the vehicle, and the like.
[0209] The body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs. For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, rear lamps, brake lamps, blinkers, or fog lamps. In this case, the body system control unit 12020 can receive radio waves transmitted from a portable device that substitutes for a key, or signals from various switches. The body system control unit 12020 receives input of these radio waves or signals, and controls a door lock device, a power window device, lamps, and the like of the vehicle.
[0210] The vehicle exterior information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted. For example, an imaging unit 12031 is connected to the vehicle exterior information detection unit 12030. The vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image outside the vehicle and receives the captured image. The vehicle exterior information detection unit 12030 may perform object detection processing of a person, a car, an obstacle, a sign, or characters on a road surface, or distance detection processing, on the basis of the received image.
[0211] The imaging unit 12031 is an optical sensor that receives light and outputs an electric signal corresponding to the amount of received light. The imaging unit 12031 can output an electric signal as an image or as distance measurement information. In addition, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared rays.
[0212] The vehicle interior information detection unit 12040 detects vehicle interior information. For example, a driver state detection unit 12041 that detects the state of the driver is connected to the vehicle interior information detection unit 12040. The driver state detection unit 12041 includes, for example, a camera that captures the driver, and the vehicle interior information detection unit 12040 may calculate, in accordance with the detected information input from the driver state detection unit 12041, the degree of tiredness or concentration of the driver or determine whether or not the driver is asleep.
[0213] A microcomputer 12051 is able to calculate a control target value of the driving force generation device, the steering mechanism, or the braking device, on the basis of the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and output a control command to the drive system control unit 12010. For example, the microcomputer 12051 can perform cooperative control for the purpose of implementing advanced driver assistance system (ADAS) functions including vehicle collision avoidance or impact mitigation, tracking based on inter-vehicle distance, vehicle speed maintenance, vehicle collision warning, or vehicle lane departure warning.
[0214] In addition, the microcomputer 12051 can also perform cooperative control for the purpose of automatic driving to travel the vehicle autonomously without relying on the operation of the driver by controlling the driving force generation device, the steering mechanism, or the braking device in accordance with the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
[0215] Further, the microcomputer 12051 can output a control command to the body system control unit 12020 on the basis of the information outside the vehicle acquired by the vehicle exterior information detection unit 12030. For example, the microcomputer 12051 can control the headlamps according to the position of the preceding vehicle or oncoming vehicle detected by the vehicle exterior information detection unit 12030, and can perform cooperative control for the purpose of anti-glare, such as switching a high beam to a low beam.
[0216] The audio image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly providing information to a vehicle occupant or the outside of the vehicle. In the example of
[0217]
[0218] In
[0219] The imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions including a front nose, a side mirror, a rear bumper, a rear door, and an upper portion of a windshield in the vehicle interior of the vehicle 12100. The imaging unit 12101 provided at the front nose and the imaging unit 12105 provided at the upper portion of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100. The imaging units 12102 and 12103 provided at the side mirrors mainly acquire images of the side of the vehicle 12100. The imaging unit 12104 provided at the rear bumper or the rear door mainly acquires an image behind the vehicle 12100. The front images acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic signal, a traffic sign, a lane, or the like.
[0220] Note that
[0221] At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information. For example, at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
[0222] For example, the microcomputer 12051 uses the distance information obtained from the imaging units 12101 to 12104 to calculate the distance to a three-dimensional object in the imaging ranges 12111 to 12114 and the temporal change of the distance (relative speed with respect to the vehicle 12100), so that it is possible to extract, particularly as a preceding vehicle, the closest three-dimensional object on the traveling path of the vehicle 12100 and the three-dimensional object that travels at a predetermined speed (e.g., 0 km/h or more) in substantially the same direction as the vehicle 12100. Further, the microcomputer 12051 can set in advance an inter-vehicle distance to be secured in front of the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform the cooperative control for the purpose of automatic driving or the like to travel autonomously without relying on the operation of the driver.
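The relative-speed reasoning in the paragraph above can be sketched numerically; the units, sampling interval, and function names are assumptions for illustration:

```python
def object_speed_kmh(ego_speed_kmh, dist_prev_m, dist_now_m, dt_s):
    # The temporal change of the measured distance gives the relative
    # speed with respect to the ego vehicle; adding the ego vehicle's
    # own speed gives the object's speed along the path, which can then
    # be tested against the preceding-vehicle criterion (e.g., 0 km/h
    # or more in substantially the same direction).
    relative_kmh = (dist_now_m - dist_prev_m) / dt_s * 3.6
    return ego_speed_kmh + relative_kmh

# Closing at 1 m/s on an object while driving at 50 km/h:
speed = object_speed_kmh(50.0, 30.0, 29.0, 1.0)  # → 46.4 km/h
```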
[0223] For example, the microcomputer 12051 can classify three-dimensional object data related to three-dimensional objects on the basis of the distance information obtained from the imaging units 12101 to 12104, extract other three-dimensional objects such as two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, and power poles, and use the extracted data for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 between obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to recognize visually. In addition, the microcomputer 12051 determines the collision risk indicating the risk of collision with each obstacle, and if the collision risk is equal to or higher than a setting value and there is a possibility of collision, the microcomputer 12051 can assist driving to avoid collision by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062, or executing forced deceleration or avoidance steering via the drive system control unit 12010.
[0224] At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays. For example, the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the captured images of the imaging units 12101 to 12104. Such pedestrian recognition is carried out, for example, by a procedure of extracting feature points in the captured images of the imaging units 12101 to 12104 as infrared cameras and a procedure of performing pattern matching processing on a series of feature points indicating the contour of an object to determine whether or not the object is a pedestrian. When the microcomputer 12051 determines that a pedestrian exists in the captured images of the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 to display a rectangular contour line for emphasizing the recognized pedestrian in a superimposed manner. Further, the audio image output unit 12052 may control the display unit 12062 so as to display an icon or the like indicating a pedestrian at a desired position.
[0225] The example of the vehicle control system to which the technology according to the present disclosure can be applied has been described above. The technology according to the present disclosure is applicable to the imaging unit 12031 and the like among the configurations described above.
9. Summary
[0226] As described above, according to each embodiment of the present disclosure, it is possible to avoid a complicated configuration and an increase in manufacturing cost of the imaging module 100, without requiring a large number of optical components such as a diffraction grating and a mirror. Furthermore, according to each embodiment of the present disclosure, since a combined image is generated using a one frame image, it is possible to generate an image obtained by combining a plurality of spectral images into one image in an imaging time for one frame. That is, according to the present embodiment, it is possible to obtain a combined image of spectral images with a simple configuration and at high speed.
[0227] Note that, in the embodiments and the modifications of the present disclosure described above, it has been described that a combined image is generated by superimposing a plurality of cut ROIs. However, each of the embodiments and modifications of the present disclosure is not limited thereto. For example, in each embodiment, the one frame image may be output, or the ROI cut from the one frame image may be output. In this case, since images corresponding to the respective wavelengths of light can be separately acquired and analyzed, the presence or absence, distribution, and the like of components corresponding to each wavelength can be easily recognized.
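As a rough illustration of the superimposition described above, the sketch below cuts ROIs, whose coordinates are assumed to be designated in advance, from a single frame image and superimposes them by per-pixel averaging. The frame layout and ROI tuple format are hypothetical, not taken from the disclosure.

```python
import numpy as np

def combine_rois(frame, rois):
    """Cut each subject image (ROI) from the one-frame image and
    superimpose them by per-pixel averaging.

    frame : 2-D array holding all spectral subject images in one frame
    rois  : list of (row, col, height, width) tuples designated in advance
    """
    cuts = [frame[r:r + h, c:c + w].astype(float) for r, c, h, w in rois]
    if len({cut.shape for cut in cuts}) != 1:
        raise ValueError("all ROIs must have the same size to be superimposed")
    return np.mean(cuts, axis=0)
```

Returning the elements of `cuts` individually, instead of their average, corresponds to the per-wavelength output and analysis mentioned in the same paragraph.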
10. Supplement
[0228] Note that the embodiment of the present disclosure described above can include, for example, a program for causing a computer to function as the imaging system 10 according to the present embodiment, and a non-transitory tangible medium on which the program is recorded. In addition, the program may be distributed via a communication line (including wireless communication) such as the Internet.
[0229] Further, each step in the imaging method according to the embodiment of the present disclosure described above may not necessarily be processed in the described order. For example, each step may be processed in an appropriately changed order. In addition, each step may be partially processed in parallel or individually instead of being processed in time series. Furthermore, the processing method of each step may not necessarily be processed according to the described method, and may be processed by another method by another functional unit, for example.
[0230] The preferred embodiments of the present disclosure have been described in detail with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to such examples. It is obvious that a person with ordinary skill in the technical field of the present disclosure could conceive of various alterations or corrections within the scope of the technical ideas described in the appended claims, and it should be understood that such alterations or corrections will naturally belong to the technical scope of the present disclosure.
[0231] Furthermore, the effects described in the present specification are merely illustrative or exemplary and are not restrictive. That is, the technology according to the present disclosure can exhibit other effects obvious to those skilled in the art from the description of the present specification in addition to or in place of the above effects.
[0232] Note that the present technology can also take the following configurations.
(1) An imaging device comprising:
[0233] an imaging unit that generates a one frame image by sequentially receiving each reflected light reflected by a subject by intermittently and sequentially irradiating the subject with each irradiation light having a different wavelength according to a position of the moving subject, temporarily and sequentially holding signal information based on the reflected light of each wavelength, and collectively reading the held signal information; and
[0234] a combining unit that generates a combined image by cutting a subject image corresponding to the reflected light of each wavelength from the one frame image and superimposing a plurality of the cut subject images.
(2) The imaging device according to (1), wherein
[0235] the imaging unit has a plurality of pixels, and
[0236] each of the pixels includes
[0237] an imaging element that receives the reflected light to generate the signal information, and
[0238] a memory unit that temporarily holds the signal information from the imaging element.
(3) The imaging device according to (2), wherein
[0239] the imaging unit operates in a global shutter system that collectively reads the signal information held in each memory unit.
(4) The imaging device according to (2) or (3), wherein
[0240] the pixel includes an InGaAs imaging element that detects near infrared light.
(5) The imaging device according to any one of (2) to (4), wherein
[0241] the combining unit cuts each subject image corresponding to the reflected light of each wavelength by cutting a plurality of predetermined regions designated in advance from the one frame image.
(6) The imaging device according to any one of (2) to (4), wherein
[0242] the combining unit has an imaging region specifying unit that specifies each subject image corresponding to the reflected light of each wavelength in the one frame image.
(7) The imaging device according to (6), wherein
[0243] the combining unit further has a binarization processing unit that converts the one frame image into a two-step color tone to generate a two-step color tone image, and
[0244] the imaging region specifying unit specifies each subject image corresponding to the reflected light of each wavelength, on the basis of the two-step color tone image.
(8) The imaging device according to (6), wherein
[0245] the imaging unit sequentially receives each reference light reflected by the subject by intermittently and sequentially irradiating the subject, which moves at a constant speed along a predetermined direction, with the reference light before and after irradiation with each irradiation light,
[0246] the imaging unit generates the one frame image including a subject image corresponding to the reference light by temporarily and sequentially holding the signal information based on the reference light and collectively reading the held signal information, and
[0247] the imaging region specifying unit specifies a subject image corresponding to the reflected light of each wavelength located between subject images corresponding to the two reference lights, on the basis of the subject images corresponding to the two reference lights.
(9) The imaging device according to any one of (2) to (8), wherein
[0248] the combining unit has a combining processing unit that calculates a color parameter of each of the pixels in the subject image corresponding to the reflected light of each wavelength, on the basis of color information set in advance to correspond to the each wavelength and signal information of each of the pixels in the subject image corresponding to the reflected light of each wavelength, calculates an addition average of the color parameters in the plurality of subject images for each of the pixels, and generates a color image as the combined image on the basis of the calculated addition average.
(10) The imaging device according to any one of (1) to (9), wherein
[0249] the imaging unit has a plurality of filters that are provided to face the subject, are sequentially arranged along a moving direction of the subject, and transmit light of different wavelengths.
(11) The imaging device according to (10), wherein
[0250] the plurality of filters are on-chip color filters or plasmon filters.
(12) The imaging device according to any one of (1) to (9), further comprising:
[0251] an irradiation unit that intermittently and sequentially irradiates the subject with the irradiation light having a different wavelength according to a position of the moving subject.
(13) The imaging device according to (12), wherein
[0252] the irradiation unit has a plurality of light emitting elements that emit light of different wavelengths.
(14) The imaging device according to (13), wherein
[0253] the plurality of light emitting elements include a plurality of light emitting diodes that emit near infrared light.
(15) The imaging device according to (13) or (14), wherein
[0254] the plurality of light emitting elements include a reference light emitting element that emits reference light having a predetermined wavelength other than near infrared light.
(16) The imaging device according to (15), wherein
[0255] the reference light emitting element emits visible light as the reference light.
(17) The imaging device according to any one of (12) to (16), further comprising:
[0256] a control unit that controls the imaging unit such that the imaging unit receives the reflected light in synchronization with irradiation by the irradiation unit.
(18) An imaging device comprising:
[0257] an imaging unit that generates a one frame image by sequentially receiving each reflected light reflected by a subject by intermittently and sequentially irradiating the subject with each irradiation light having a different wavelength according to a position of the moving subject, temporarily and sequentially holding signal information based on the reflected light of each wavelength, and collectively reading a part of the signal information corresponding to a plurality of predetermined regions designated in advance in the held signal information; and
[0258] a combining unit that generates a combined image by cutting a subject image corresponding to the reflected light of each wavelength from the one frame image and superimposing a plurality of the cut subject images.
(19) An imaging system comprising:
[0259] a moving device that moves a subject;
[0260] an irradiation device that intermittently and sequentially irradiates the subject with irradiation light having a different wavelength according to a position of the moving subject;
[0261] an imaging apparatus that generates a one frame image by sequentially receiving each reflected light reflected by the subject by the irradiation, temporarily and sequentially holding signal information based on the reflected light of each wavelength, and collectively reading the held signal information; and
[0262] a combining device that generates a combined image by cutting a subject image corresponding to the reflected light of each wavelength from the one frame image and superimposing a plurality of the cut subject images.
(20) An imaging method comprising:
[0263] generating a one frame image by sequentially receiving each reflected light reflected by a subject by intermittently and sequentially irradiating the subject with each irradiation light having a different wavelength according to a position of the moving subject, temporarily and sequentially holding signal information based on the reflected light of each wavelength, and collectively reading the held signal information; and
[0264] generating a combined image by cutting a subject image corresponding to the reflected light of each wavelength from the one frame image and superimposing a plurality of the cut subject images.
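Configuration (9) above can be sketched as follows: each wavelength is assigned a preset RGB color, each pixel's color parameter is that preset color scaled by the pixel's normalized signal, and the combined color image is the per-pixel addition average over the subject images. The wavelength-to-color table and the full-scale normalization are assumptions for illustration.

```python
import numpy as np

def pseudo_color_combine(subject_images, colors, full_scale=255.0):
    """Addition-average color combination (illustrative sketch of
    configuration (9)).

    subject_images : list of 2-D arrays, one per wavelength, same shape
    colors         : list of (R, G, B) tuples preset for each wavelength
    """
    params = []
    for img, rgb in zip(subject_images, colors):
        weight = img.astype(float) / full_scale           # normalized signal
        # Color parameter: preset color scaled per pixel by the signal.
        params.append(weight[..., None] * np.asarray(rgb, dtype=float))
    # Addition average of the color parameters over all subject images.
    return np.mean(params, axis=0)
```

With two wavelengths mapped to red and blue, a pixel that responds strongly at both wavelengths comes out purple, making the per-wavelength component distribution visible in a single pseudo-color image (cf. reference sign 806).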
Reference Signs List
10 IMAGING SYSTEM
100 IMAGING MODULE
110 IRRADIATION UNIT
112a, 112b, 112c, 112d, 112f LIGHT EMITTING ELEMENT
120 IMAGING DEVICE
130 IMAGING UNIT
132 LENS UNIT
134 IMAGING ELEMENT
136 MEMORY UNIT
138 READING UNIT
140 COMBINING UNIT
142 BINARIZATION PROCESSING UNIT
144 IMAGING REGION SPECIFYING UNIT
146 COMBINING PROCESSING UNIT
150 CONTROL UNIT
160 FILTER UNIT
162a, 162b, 162c FILTER
200 CONTROL SERVER
300 BELT CONVEYOR
410 PIXEL ARRAY UNIT
432 VERTICAL DRIVE CIRCUIT UNIT
434 COLUMN SIGNAL PROCESSING CIRCUIT UNIT
436 HORIZONTAL DRIVE CIRCUIT UNIT
438 OUTPUT CIRCUIT UNIT
440 CONTROL CIRCUIT UNIT
442 PIXEL DRIVE WIRE
444 VERTICAL SIGNAL LINE
446 HORIZONTAL SIGNAL LINE
448 INPUT/OUTPUT TERMINAL
480 PERIPHERAL CIRCUIT UNIT
500 SEMICONDUCTOR SUBSTRATE
800 SUBJECT
802 IMAGING
804 ROI
804a, 804b, 804c PIXEL DATA GROUP
806 PSEUDO COLOR IMAGE
900 ELECTRONIC APPARATUS
902 IMAGING APPARATUS
910 OPTICAL LENS
912 SHUTTER MECHANISM
914 DRIVE CIRCUIT UNIT
916 SIGNAL PROCESSING CIRCUIT UNIT