Imaging device and method
11054304 · 2021-07-06
Assignee
Inventors
CPC classification
G01J3/0208
PHYSICS
A61B5/14546
HUMAN NECESSITIES
G01J3/0205
PHYSICS
A61B5/0075
HUMAN NECESSITIES
G01N21/314
PHYSICS
G01J3/027
PHYSICS
G01J1/0411
PHYSICS
G01J4/02
PHYSICS
A61B5/14532
HUMAN NECESSITIES
International classification
H01L27/00
ELECTRICITY
A61B5/00
HUMAN NECESSITIES
G01N21/31
PHYSICS
A61B5/1455
HUMAN NECESSITIES
Abstract
An imaging device and method are provided. Light from an object is provided as a plurality of sets of light beams to a phase difference array having a plurality of elements. The phase difference array is configured to provide different optical paths for light included within at least some of a plurality of sets of light beams. The light from the phase difference array is received at an imaging element array. The imaging element array includes a plurality of imaging elements. Information obtained from hyperspectral imaging data based on output signals of the imaging element array can be displayed.
Claims
1. An imaging device, comprising: a phase difference array having a plurality of phase difference elements arranged along a plane and configured to provide different optical paths for light provided as a plurality of sets of light beams simultaneously transmitted through the phase difference array, wherein the plurality of phase difference elements include a first phase difference element configured to generate a first optical path difference between a first portion of a first set of the plurality of sets of light beams incident on a first part of the first phase difference element and a second portion of the first set of light beams incident on a second part of the first phase difference element, and a second phase difference element configured to generate a second optical path difference between a first portion of a second set of the plurality of sets of light beams incident on a first part of the second phase difference element and a second portion of the second set of light beams incident on a second part of the second phase difference element, wherein the first phase difference element and the second phase difference element are arranged adjacent to each other along the plane; and an imaging element array including a plurality of imaging elements, wherein the plurality of imaging elements includes a first imaging element configured to receive the first set of light beams transmitted through the first phase difference element of the phase difference array and a second imaging element configured to receive the second set of light beams transmitted through the second phase difference element of the phase difference array.
2. The imaging device of claim 1, further comprising: an objective lens array, wherein the objective lens array includes a plurality of objective lenses, and wherein the objective lens array is configured to provide the plurality of sets of light beams to the phase difference array.
3. The imaging device of claim 2, wherein the plurality of sets of light beams provided by the objective lens array are cylindrical parallel light beams.
4. The imaging device of claim 1, wherein for the first phase difference element and/or the second phase difference element, a thickness of the first part of the phase difference element is different than a thickness of the second part of the phase difference element.
5. The imaging device of claim 4, wherein the first part of the phase difference element has a semicircular area.
6. The imaging device of claim 4, wherein the first part of the phase difference element has a cylindrical area.
7. The imaging device of claim 4, wherein individual phase difference elements of the plurality of phase difference elements in the phase difference array have thicknesses of the first part that increase from a phase difference element at a first end of the phase difference array to a phase difference element at a second end of the phase difference array.
8. The imaging device of claim 1, further comprising: a polarizer array, wherein the polarizer array includes four different types of polarizers that differ from one another by at least 45 degrees.
9. The imaging device of claim 1, further comprising: an imaging lens array including a plurality of imaging lenses, wherein the imaging lens array is positioned between the phase difference array and the imaging element array.
10. The imaging device of claim 9, wherein the plurality of imaging lenses of the imaging lens array are configured to focus the plurality of sets of light beams transmitted through the phase difference array onto at least some of the imaging elements.
11. The imaging device of claim 1, wherein each of the plurality of imaging elements includes a plurality of pixels.
12. The imaging device of claim 1, wherein light from a first area of an imaged object is included in a first one of the plurality of sets of light beams, and wherein light from the first area of the imaged object is included in a second one of the plurality of sets of light beams.
13. The imaging device of claim 1, wherein the plurality of phase difference elements are arranged in a two-dimensional array.
14. A detection apparatus, comprising: a connecting structure; a light source, wherein the light source is connected to the connecting structure; an enclosure, wherein the enclosure is connected to the connecting structure, and wherein the enclosure includes: a phase difference array having a plurality of phase difference elements arranged along a plane and configured to provide different optical paths for light provided as a plurality of sets of light beams simultaneously transmitted through the phase difference array, wherein the plurality of phase difference elements include a first phase difference element configured to generate a first optical path difference between a first portion of a first set of the plurality of sets of light beams incident on a first part of the first phase difference element and a second portion of the first set of light beams incident on a second part of the first phase difference element, and a second phase difference element configured to generate a second optical path difference between a first portion of a second set of the plurality of sets of light beams incident on a first part of the second phase difference element and a second portion of the second set of light beams incident on a second part of the first phase difference element, wherein the first phase difference element and the second phase difference element are arranged adjacent to each other along the plane; an imaging element array including a plurality of imaging elements, wherein the plurality of imaging elements includes a first imaging element configured to receive the first set of light beams transmitted through the first phase difference element of the phase difference array and a second imaging element configured to receive the second set of light beams transmitted through the second phase difference element of the phase difference array; and a display, wherein the display is connected to the connecting structure, and wherein the display is operable to display 
detection information generated from data provided by the imaging element array.
15. The detection apparatus of claim 14, further comprising: an objective lens array, wherein the objective lens array includes a plurality of objective lenses, and wherein the objective lens array is configured to provide the plurality of sets of light beams to the phase difference array.
16. The detection apparatus of claim 15, wherein the plurality of sets of light beams provided by the objective lens array are cylindrical parallel light beams.
17. The detection apparatus of claim 14, wherein for the first phase difference element and/or the second phase difference element, a thickness of the first part of the phase difference element is different than a thickness of the second part of the phase difference element.
18. The detection apparatus of claim 17, wherein the first part of the phase difference element has a semicircular area.
19. The detection apparatus of claim 17, wherein the first part of the phase difference element has a cylindrical area.
20. The detection apparatus of claim 17, wherein individual phase difference elements of the plurality of phase difference elements in the phase difference array have thicknesses of the first part that increase from a phase difference element at a first end of the phase difference array to a phase difference element at a second end of the phase difference array.
21. The detection apparatus of claim 14, wherein the enclosure further includes: a polarizer array, wherein the polarizer array includes four different types of polarizers that differ from one another by at least 45 degrees.
22. The detection apparatus of claim 14, wherein the connecting structure is a belt.
23. The detection apparatus of claim 14, wherein the plurality of phase difference elements are arranged in a two-dimensional array.
24. A method for detecting a physical property, comprising: emitting light onto an object; receiving light from the object at a plurality of phase difference elements included in a phase difference array, wherein the plurality of phase difference elements are arranged along a plane and configured to provide different optical paths for the received light simultaneously transmitted through the phase difference array, wherein the plurality of phase difference elements include a first phase difference element configured to generate a first optical path difference between a first portion of a light beam incident on a first part of the first phase difference element and a second portion of the light beam incident on a second part of the first phase difference element, and a second phase difference element configured to generate a second optical path difference between a first portion of a light beam incident on a first part of the second phase difference element and a second portion of the light beam incident on a second part of the second phase difference element, wherein the first phase difference element and the second phase difference element are arranged adjacent to each other along the plane; receiving light from the first and second phase difference elements at an imaging element array; and displaying information obtained from hyperspectral imaging (HSI) data based on output signals of the imaging element array.
Description
BRIEF DESCRIPTION OF DRAWINGS
DESCRIPTION OF EMBODIMENTS
Example of Applying Imaging Device of the Present Technology to Blood Test Apparatus
(18) More specifically, for example, as illustrated in an external view P1 in the upper left part of the figure, the blood test apparatus 11 is worn around the arm 12.
(19) As illustrated in the upper right part of the figure, the blood test apparatus 11 includes an HSI body 31, light sources 32-1 and 32-2, a display unit 33, and a belt 34.
(20) The HSI body 31 includes an imaging device that captures the HSI. It faces the arm 12 when the blood test apparatus 11 is worn around the arm 12 by the belt 34, and it captures a reflected image formed by the light emitted from the light sources 32-1 and 32-2 being reflected by the blood inside the artery 12a (artery blood) and the blood inside the vein 12b (vein blood) within the arm 12.
(21) In this case, as illustrated in an image P12 in the lower left part of the figure, the body 31 captures an image, at a depth of a few millimeters under the skin, in which the artery 12a and the vein 12b are projected by, for example, light from red to near-infrared wavelengths.
(22) Further, the body 31 spectrally analyzes the blood in the captured artery 12a and the vein 12b, measures the oxygen concentration, the lipid level, and the blood glucose level in the blood, and displays the measurement result and information corresponding to the measurement result, on the display unit 33.
Configuration Example of Body
(23) Next, the configuration of the body 31 will be described with reference to the block diagram of the figure.
(24) The camera array 51 is configured with a plurality of camera units, for example m×n units arranged in the vertical direction and the horizontal direction. The images are cut out to the same imaging area and subjected to parallax correction such as an XY shift. The plurality of camera units capture interference images in which respectively different wavelengths are emphasized, and output the interference images to the signal processing unit 52.
(26) The signal processing unit 52 generates an interferogram by reading the image signals supplied from the respective camera units A, which include interference images in which different wavelengths are emphasized, in units of pixels at the same position. Further, the signal processing unit 52 generates data configured with spectroscopic spectra by performing a Fourier transform on the interferogram in units of pixels. Then, the signal processing unit 52 analyzes the necessary components, such as the oxygen concentration, the lipid level, and the blood glucose level in the blood, based on the generated spectroscopic spectral data, and displays the analysis result on the display unit 33. During imaging, the signal processing unit 52 causes the light sources 32-1 and 32-2 to emit light. In the following description, when the light sources 32-1 and 32-2 need not be distinguished, they are simply referred to as the light source 32, and other components are referred to in the same manner.
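The per-pixel flow described above (stack the interference images, treat the values at each pixel position across the stack as an interferogram, and Fourier-transform along that axis) can be sketched in Python as follows. This is a minimal illustration under stated assumptions, not the patent's implementation; the function and variable names, and the DC-removal step, are the editor's choices.

```python
import numpy as np

def spectra_from_camera_stack(images, opd_step_um):
    """Sketch of the per-pixel processing described above.

    images: array of shape (K, H, W) -- K interference images, one per
            camera unit, already parallax-corrected so that pixel (y, x)
            views the same object point in every image.
    opd_step_um: optical path difference step between camera units, in um.

    Collecting the K values at each pixel position gives an interferogram;
    a Fourier transform of that interferogram gives a spectrum per pixel.
    """
    images = np.asarray(images, dtype=float)
    k = images.shape[0]
    # Remove the DC (mean) component of each pixel's interferogram.
    interferograms = images - images.mean(axis=0, keepdims=True)
    # FFT along the phase-difference axis: one spectrum per pixel.
    spectra = np.abs(np.fft.rfft(interferograms, axis=0))
    # Wavenumber axis, in cycles per um of optical path difference.
    wavenumbers = np.fft.rfftfreq(k, d=opd_step_um)
    return wavenumbers, spectra
```

For monochromatic input the resulting spectrum peaks at the bin matching the light's wavenumber, which is the behavior the interferogram-to-spectrum conversion relies on.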
DETAILED CONFIGURATION OF CAMERA ARRAY
(27) Next, the configuration of the camera array 51 will be described in detail with reference to the figure.
(28) The camera array 51 is configured with a lens array 71, a phase difference array 72, and a lens array 73, which are optical elements, and an imaging element array 74. Further, an apple image in the figure represents the object to be imaged.
(29) The lens array 71 is, for example, an array of objective lenses of focal length f, one configured for each camera unit. The lens array 71 converts the incident light into cylindrical parallel light beams for the respective camera units A, and inputs the parallel light beams to the phase difference array 72. In particular, an objective lens is provided for each camera unit A of the imaging element array, and each objective lens creates a set of parallel light beams from the incident light for its respective camera unit A.
(30) The phase difference array 72 includes a plurality of phase difference elements that are defined by light shielding portions 72a. For example, one phase difference element can be provided for each set of parallel light beams formed by the lens array 71. At least some of the elements of the phase difference array 72 include a filter that covers a portion of the parallel light beams incident from the lens array 71 with an object 72b having a predetermined refractive index. The elements of the phase difference array 72 associated with such an object 72b generate an optical path difference between the light beams passing through an area of the element covered with the object 72b and the light beams passing through an area of the element not covered with the object. The phase difference array 72 generates a phase difference corresponding to the optical path difference, and inputs the light with that phase difference to the lens array 73 as an imaging lens array. In particular, the phase difference array 72 can include an element for or corresponding to each camera unit A. The phase differences are different for the respective camera units A, and the phase difference may be zero in some cases.
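The optical path difference produced by the object 72b follows from elementary optics: relative to the uncovered (air) path of the same length, a plate of refractive index n and thickness d adds (n − 1)·d of optical path. A minimal sketch (function names are illustrative, not from the patent):

```python
def optical_path_difference_um(n, thickness_um):
    """Optical path difference (in um, 'air conversion') between light
    passing through a plate of refractive index n and thickness
    thickness_um, and light passing through the same thickness of air
    (n = 1): OPD = (n - 1) * d."""
    return (n - 1.0) * thickness_um

def thickness_for_opd_um(n, opd_um):
    """Plate thickness needed to realize a target optical path
    difference for a material of refractive index n."""
    return opd_um / (n - 1.0)
```

For example, with an assumed refractive index of 1.5, realizing a 0.3 micrometer air-conversion step requires a 0.6 micrometer thickness step in the material.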
(31) The lens array 73 is an array of imaging lenses, and images the light flux with the phase difference added by the phase difference array 72, on the imaging element array 74, in units of the camera units A. In other words, the interference image is obtained.
(32) The imaging element array 74 is configured with complementary metal oxide semiconductor (CMOS) image sensors, captures different interference images in units of the camera units A, and outputs image signals of the captured interference images to the signal processing unit 52. In other words, the imaging element array 74 is a single imaging element as a whole, and the camera units A described above are obtained by classifying the pixels on the imaging element, for each imaging element area for capturing a unit image for capturing the same imaging area. Here, when obtaining the same imaging area, images are cut out for parallax correction, and subjected to an XY shift. Thus, the camera units A are not separate imaging elements, and a single imaging element area as a whole represents an area which is divided for each area of the predetermined number of pixels. That is, each of the camera units A includes a plurality of pixels. In addition, a monochrome imaging device without a color filter is used as the imaging element array 74.
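The area division described above, in which one monochrome sensor is logically split into camera units A by classifying its pixels into equal areas, can be illustrated with a small numpy sketch. The names and the equal-tile assumption are the editor's; parallax correction by an XY shift would follow per sub-image.

```python
import numpy as np

def split_into_camera_units(frame, units_y, units_x):
    """Split a single sensor frame into a stack of camera-unit sub-images.

    frame: 2-D array from the single monochrome imaging element, whose
           pixel grid is divided into units_y x units_x equal areas
           (the 'camera units A' described above).
    Returns an array of shape (units_y * units_x, H, W), where H and W
    are the per-unit pixel counts.
    """
    total_h, total_w = frame.shape
    h, w = total_h // units_y, total_w // units_x
    # Crop any remainder pixels, then regroup rows/columns into tiles.
    tiles = frame[: units_y * h, : units_x * w]
    tiles = tiles.reshape(units_y, h, units_x, w).swapaxes(1, 2)
    return tiles.reshape(units_y * units_x, h, w)
```

The returned stack is in the (K, H, W) layout that per-pixel interferogram processing expects.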
(34) The lens array 73 as the imaging lens array images the light flux added with the phase difference on the imaging element array 74, in units of the camera units A, and causes the imaging elements constituting the imaging element array 74 to capture interference images in which different wavelengths are emphasized corresponding to the added phase difference, in units of the camera units A.
(37) The thicknesses D of the object 72b, which cause the phase differences, are different for the respective camera units A, and within the wavelength range to be measured, the refractive index dispersion of each type of object is assumed to be sufficiently small. Further, the object 72b may also be of a reflection type, for example, with a 45-degree incidence; for the configuration of the reflection type of a 45-degree incidence, see Japanese Patent No. 5078004.
(38) <Signal Processing Method>
(39) Next, the signal processing method will be described with reference to the figure.
(40) Further, here, the number of pixels of respective camera units A included in the camera array 51 is assumed to be, for example, a QVGA (320×240 pixels). Further, for simplicity of explanation, it is assumed that the object is present at infinity and that the parallax for each camera unit is regarded as zero. In addition, pixels at a predetermined position in the interference images captured by respective camera units A constituting the camera array 51 are assumed to be the pixels at the same position within each of the interference images.
(42) In other words, the pixels P(1) to P(mn) at the same position in the respective camera units A(1) to A(mn) view the same object point through different phase differences; the set of their outputs, ordered by phase difference, constitutes the interferogram for that pixel position.
(43) Accordingly, the interferograms of the same number as the number of pixels corresponding to QVGA (320×240 pixels) are obtained.
(44) The spectroscopic spectrum of each pixel corresponding to each image of the camera unit is obtained, as illustrated in the lower right part of the figure, by performing a Fourier transform on the interferogram of that pixel.
(45) Accordingly, through such a process, the spectroscopic spectra for the image of QVGA (320×240 pixels) are obtained. In the following description, the spectroscopic spectral data for each pixel of the QVGA (320×240 pixels) image obtained in this manner is collectively referred to as an HSI Data cube. In addition, the image of an apple shown in the lower center part of the figure represents the captured image.
(46) For example, when the camera array 51 contains 8×8 = 64 camera units A (8 in the horizontal direction × 8 in the vertical direction), the total number of pixels is 64×QVGA ≈ 4.9 M pixels, which can be realized with a current commercial solid-state imaging element. Further, if the cell size of the imaging element is assumed to be 3 micrometers, each camera unit A is about 1 mm in both the horizontal and vertical directions, so the entire 8×8 array fits within about 10 mm in each direction; it is therefore possible to sufficiently achieve miniaturization in practice. In fact, the signal process is performed for each of the 64 QVGA areas on a single CMOS image sensor (CIS) of 4.9 M pixels or more, such that a process is performed for each area corresponding to the 64 camera units A.
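The figures quoted above can be checked with a few lines of arithmetic; the values are taken from the text, and the 3-micrometer cell size is the text's stated assumption.

```python
# Checking the pixel-count and size figures quoted above.
qvga = 320 * 240                   # pixels per camera unit (QVGA)
units = 8 * 8                      # camera units in the 8x8 array
total_pixels = units * qvga        # 4,915,200, i.e. ~4.9 M pixels

cell_um = 3.0                      # assumed imaging-element cell size, um
unit_mm = 320 * cell_um / 1000.0   # ~0.96 mm per camera unit (horizontal)
array_mm = 8 * unit_mm             # ~7.7 mm: fits within about 10 mm
```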
(47) <Specific Design Method>
(48) Next, examples of the number of camera units A (the number of a plurality of images which are set in the lens arrays 71 and 73), and a method of designing a phase difference step in the phase difference array will be specifically described.
(50) For example, when measuring the spectral absorption characteristics of the oxygenated hemoglobin (HbO.sub.2) and the reduced hemoglobin (Hb) in the blood, a wavelength resolution of 25 nm around a center wavelength λc of 0.665 micrometers is assumed to be necessary.
(51) The wavelength resolution obtained from the sampling theorem in the Fourier domain is (λc)²/(phase difference range). Since the necessary wavelength resolution is 25 nm, the phase difference range is (λc)²/0.025 = (0.665)²/0.025 ≈ 17.7 micrometers. In addition, the number of phase steps (the number of images having the same captured imaging area in the lens arrays 71 and 73, in other words, the number of camera units A) must be equal to or greater than (phase difference range)/(phase difference step) = 17.7/0.3 = 59. Accordingly, a phase difference array in which 8×8 = 64 levels of phase difference are formed at steps of 300 nm, from 0 nm to 17.7 micrometers by the air conversion, and 8×8 lens arrays are configured.
(52) Accordingly, in order to form such a phase difference array with a step of 300 nm by the air conversion, the thickness step of the object 72b illustrated in the figure is determined with the refractive index of the object 72b taken into account.
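The design rule used above (wavelength resolution = λc² / phase-difference range; step count = range / step) can be expressed as a short sketch. Function names are illustrative, and the epsilon in the ceiling guards against floating-point noise when the quotient is exact.

```python
import math

def phase_difference_range_um(center_um, resolution_um):
    """Required optical path difference range, from the sampling theorem
    in the Fourier domain: resolution = center_wavelength**2 / range
    (all quantities in micrometers)."""
    return center_um ** 2 / resolution_um

def num_phase_steps(range_um, step_um):
    """Minimum number of phase steps (i.e. camera units) covering the
    range at the given step size."""
    return math.ceil(range_um / step_um - 1e-9)
```

With λc = 0.665 um and a resolution of 0.025 um, the range is about 17.7 um, and a 0.3 um step then requires 59 steps, hence the 8×8 = 64 arrangement described above.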
(53) <Blood Test Process>
(54) Next, a blood test process using the blood test apparatus described above will be described with reference to a flowchart.
(55) In step S11, the signal processing unit 52 causes the light source 32 to emit light, and project the light to an area in which the artery 12a and the vein 12b of the arm 12 to be detected may be present.
(56) In step S12, each lens in the lens array 71, provided at the preceding stage in the light incident direction, transmits the incident light to its corresponding camera unit A, such that the incident light is converted into sets of parallel light that are incident on the phase difference array 72.
(57) In step S13, the phase difference array 72 causes the light flux with the added phase difference to be incident on the lens array 73, with respect to each camera unit A.
(58) In step S14, each lens in the lens array 73, provided at the subsequent stage in the light incident direction, passes the respective light fluxes incident from the phase difference array 72 so as to be imaged on the imaging element array 74.
(59) In step S15, the light receiving level of the interference image in each pixel of the imaging element array 74 is detected, and a pixel output which is a detection result is output to the signal processing unit 52, in units of the camera units A.
(60) In step S16, the signal processing unit 52 generates the data constituting the interferogram in units of pixels, based on the pixel signal in units of the camera units A that is supplied from the imaging element array 74.
(61) In step S17, the signal processing unit 52 performs a fast Fourier transform (FFT) on the data obtained as an interferogram, and generates the hyperspectral imaging (HSI) Data cube configured with the spectroscopic spectral data for each pixel.
(62) In step S18, the signal processing unit 52 extracts the spectroscopic spectra of the artery portion and the vein portion from the HSI Data cube, which is spectroscopic spectral data, analyzes predetermined components in the blood, and displays the analyzed values on the display unit 33 as the test results. Depending on the contents to be analyzed, the analysis target can be switched between artery blood and vein blood, or both sets of data can be used. For example, the signal processing unit 52 detects the oxygen concentration and the lipid level in the blood based on the spectroscopic spectral data of the artery portion, detects the blood glucose level and the like based on the spectroscopic spectral data of the vein portion, and displays the detected values on the display unit 33 as an analysis result.
(63) Since the spectroscopic spectrum is obtained by performing a fast Fourier transform on the interference image in the above process, there is no energy loss such as that caused by a spectral filter, and it becomes possible to realize an HSI of high sensitivity. In addition, since a light source of large output is not necessary with such a configuration, it is possible to miniaturize the entire apparatus. Further, since the spectroscopic spectral data for all pixels in the image can be captured simultaneously at high sensitivity, it is possible to realize moving-image capture by the HSI with an inexpensive configuration, without using special materials or moving parts.
(64) <With Respect to Manufacturing Method>
(65) Next, a manufacturing method of the phase difference array 72 will be described with reference to the flowchart of the figure.
(66) In steps S31 to S38, as illustrated in states B to J in the figure, the phase difference array 72 is formed by a semiconductor process in which a resist layer 106 is applied and patterned so that steps 106a to 106d of different heights are formed to serve as the object 72b. (The details of each state are shown only in the drawing.)
(75) Further, in the case of using a permanent resist, the process is terminated when the steps 106a to 106d are formed in the resist layer 106 in step S37, and the process of step S38 is skipped. In other words, in this case, the protrusions 106a to 106d themselves are left as the semicircular object 72b illustrated in the figure.
(76) By the above process, since the phase difference array 72 can be processed in a semiconductor process without using a special material, it becomes possible to reduce the cost of the camera array 51. Further, since the array structure is formed on a common imaging element and the signal can be processed by area division within that element, the imaging element can substantially be a single element; it is therefore possible to realize cost reduction and an increase in processing speed.
(77) In addition, the description above has used the example of a blood test apparatus that detects components such as the oxygen concentration, the lipid level, and the blood glucose level by using the spectroscopic spectral data of the HSI Data cube. However, as long as detection can be performed by using spectroscopic spectral data, the technology may also be applied to other devices, for example to various measurement fields such as health care, beauty treatment, agriculture, food hygiene, and environmental measurement.
(78) Further, although the description above assumed that the object is present at infinity so that the parallax for each camera unit can be ignored, in fact parallax occurs between the camera units. Since the image captured by each camera unit is assumed to be the same, the image captured by, for example, the camera unit in the vicinity of the center of the camera array 51 can be used as a reference: by cutting out the image for each camera unit and performing a parallax correction referred to as an XY shift, it is possible to improve the accuracy of the HSI image.
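The XY-shift parallax correction mentioned above amounts to an integer translation of each camera unit's image toward the reference unit. A minimal zero-filled sketch (the offsets themselves would come from the camera geometry or image registration, which is not shown; names are illustrative):

```python
import numpy as np

def xy_shift(image, dx, dy):
    """Translate a camera-unit image by integer pixel offsets (dx, dy)
    so it aligns with a reference unit (e.g. the one near the center of
    the camera array). Pixels shifted in from outside are zero-filled."""
    shifted = np.zeros_like(image)
    h, w = image.shape
    if dy >= 0:
        ys, yd = slice(0, h - dy), slice(dy, h)
    else:
        ys, yd = slice(-dy, h), slice(0, h + dy)
    if dx >= 0:
        xs, xd = slice(0, w - dx), slice(dx, w)
    else:
        xs, xd = slice(-dx, w), slice(0, w + dx)
    shifted[yd, xd] = image[ys, xs]
    return shifted
```

Applying the per-unit offset before stacking the camera-unit images keeps each pixel position viewing the same object point across the stack.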
(79) <First Modification>
(80) In the above, the description has been made about the example of providing the optical path difference by the object 72b covering one of the right and left halves of each element; however, the object 72b may take other shapes, as illustrated in the figure.
(82) <Second Modification>
(83) In the above, the description has been made about examples of acquiring only the HSI by the camera array 51; however, as illustrated in the figure, a non-phase difference array 151 may be provided in a part of the phase difference array 72 so that a color image can be captured at the same time.
(84) In addition, areas other than the area where the non-phase difference array 151 is provided in the imaging element array 74 are used for the process for obtaining the HSI Data cube, and the area where the non-phase difference array 151 is provided is used for the process of generating a color image.
(85) With this configuration, it becomes possible to simultaneously obtain the color image and the HSI Data cube in the same imaging area, and use the color image and the HSI while superimposing them. Further, it is desirable that the color image is in the vicinity of the center of the camera array 51.
(87) <Third Modification>
(88) The description above has covered examples of acquiring the HSI Data cube, or the HSI Data cube together with a color image. If stereo cameras are provided as the camera units at the respective ends of the camera array 51, separated in the horizontal and vertical directions, it is possible to obtain a so-called depth image including depth information regarding depth distances in units of pixels of the camera unit. This makes it possible to simultaneously obtain the spectral information and depth information of the object.
(90) <Fourth Modification>
(91) In the above, the description has been made about the example of simultaneously imaging the HSI image and the depth image, but it may be possible to obtain polarization information.
(92) In other words, as illustrated in the figure, polarizers of four orientations may be provided in the optical paths of some of the camera units of the camera array 51, so that polarization information can be obtained in addition to the HSI.
(95) In general, with polarizers, the Stokes parameters or Jones vectors representing the polarization state are obtained by analyzing the polarization components of four orientations. Therefore, the Stokes parameters or Jones vectors of each pixel can be obtained by, for example, the signal processing unit 52, based on the information from the camera units whose polarizers have the four orientations in the area of the phase difference array 72 sharing the same optical path; it is thus possible to obtain the polarization state in units of pixels.
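For the linear part of the polarization analysis described above, the Stokes parameters follow directly from the four measured intensities. A minimal sketch assuming the four orientations are 0, 45, 90, and 135 degrees (a common choice; the patent text only requires orientations differing by at least 45 degrees):

```python
def stokes_from_four_orientations(i0, i45, i90, i135):
    """Linear Stokes parameters from intensities measured behind linear
    polarizers at 0, 45, 90, and 135 degrees. Returns (S0, S1, S2);
    the circular component S3 requires a retarder and cannot be obtained
    from linear polarizers alone."""
    s0 = (i0 + i45 + i90 + i135) / 2.0   # total intensity
    s1 = i0 - i90                        # horizontal vs vertical
    s2 = i45 - i135                      # +45 vs -45 degrees
    return s0, s1, s2
```

Applied per pixel across the four corresponding camera units, this yields the polarization state in units of pixels, as the text describes.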
(96) If the size of the camera unit A corresponds to QVGA pixels, a polarizing sheet formed by a general rolling process can be cut to the size of the camera unit area, for example about a 1 mm square, and the orientations of the cut sheets can be varied; it is therefore possible to realize the configuration illustrated in the figure.
(97) With this configuration, it becomes possible to simultaneously obtain and superimpose the HSI image including the spectroscopic spectrum, the depth image, and the polarization image.
(98) The series of processes described above can be performed by hardware, but can also be performed by software. When the series of processes is performed by software, the programs constituting the software are installed from a recording medium into a computer built into dedicated hardware or, for example, a general-purpose personal computer capable of executing various functions when various programs are installed.
(100) An input unit 1006 including input devices such as a keyboard and a mouse through which the user inputs operation commands, an output unit 1007 that outputs processing operation screens and images resulting from processes on a display device, a storage unit 1008 including a hard disk drive that stores programs and various types of data, and a communication unit 1009 including a local area network (LAN) adapter that executes a communication process through a network represented by the Internet are connected to the input and output interface 1005. Further, a drive 1010 that reads data from and writes data to a removable media 1011 such as a magnetic disk (including a flexible disk), an optical disk (including a compact disc-read only memory (CD-ROM) and a digital versatile disc (DVD)), a magneto-optical disk (including a mini disc (MD)), or a semiconductor memory is connected to the input and output interface 1005.
(101) The CPU 1001 performs various processes according to a program stored in the ROM 1002, or a program which is read from the removable media 1011, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, installed in the storage unit 1008, and loaded from the storage unit 1008 into the RAM 1003. The RAM 1003 also stores, as appropriate, data and the like necessary for the CPU 1001 to execute the various processes.
(102) In the computer configured as described above, the series of processes described above is performed by the CPU 1001 loading, for example, the program stored in the storage unit 1008 into the RAM 1003 through the input and output interface 1005 and the bus 1004, and executing the program.
(103) The program that the computer (CPU 1001) executes may be provided by being recorded, for example, on the removable media 1011 as package media or the like. Further, the program may be provided through a wired or wireless transmission medium such as a local area network, the Internet, and digital satellite broadcasting.
(104) In the computer, the program may be installed in the storage unit 1008, through the input and output interface 1005, by mounting the removable media 1011 in the drive 1010. Further, the program may be received by the communication unit 1009 and installed in the storage unit 1008, through a wired or wireless transmission medium. Alternatively, the program may be installed in advance in the ROM 1002 or the storage unit 1008.
(105) In addition, the program that the computer executes may be a program in which processes are performed chronologically in the order described in the specification, or a program in which processes are performed in parallel or at a necessary timing, such as when a call is made.
(106) In the present specification, a system refers to a collection of a plurality of components (devices, modules (parts), and the like), and it does not matter whether all the components are in the same housing. Thus, a plurality of devices housed in separate housings and connected through a network, and one device including a plurality of modules housed in one housing, each constitute a system.
(107) In addition, embodiments of the present technology are not limited to the embodiments described above, and various modifications are possible in a scope without departing from the spirit of the present technology.
(108) For example, the present technology may take a cloud computing configuration in which one function is shared by a plurality of devices to be processed jointly through the network.
(109) Further, each step described in the flowchart described above may be performed by being shared by a plurality of devices, as well as by one device.
(110) Further, when one step contains a plurality of processes, a plurality of processes included in the one step may be performed by being shared by a plurality of devices, as well as by one device.
(111) In addition, the present technology may have the following configurations.
(112) (1) An imaging device, comprising:
(113) a phase difference array with a plurality of elements, wherein the phase difference array is configured to provide different optical paths for light included within at least some of a plurality of sets of light beams; and
(114) an imaging element array including a plurality of imaging elements, wherein at least one of the imaging elements is configured to receive one of the sets of light beams from the phase difference array.
(115) (2) The imaging device according to (1), further comprising: an objective lens array, wherein the objective lens array includes a plurality of objective lenses, and wherein the objective lens array is configured to provide the plurality of sets of light beams to the phase difference array.
(116) (3) The imaging device according to (2), wherein the plurality of sets of light beams provided by the objective lens array are cylindrical parallel light beams.
(117) (4) The imaging device according to any one of (1) to (3), wherein at least some of the elements of the phase difference array are configured to generate an optical path difference between a first portion of a light beam incident on a first part of the element and a second portion of the light beam incident on a second part of the element.
(118) (5) The imaging device according to (4), wherein for the at least some of the elements of the phase difference array a thickness of the first part of the element is different than a thickness of the second part of the element.
(119) (6) The imaging device according to (5), wherein the first part of the element has a semicircular area.
(120) (7) The imaging device according to (5), wherein the first part of the element has a cylindrical area.
(121) (8) The imaging device according to (5), wherein the thickness of the first part of the element increases from an element at a first end of the phase difference array to an element at a second end of the phase difference array.
(122) (9) The imaging device according to any one of (1) to (8), further comprising: a polarizer array, wherein the polarizer array includes four different types of polarizers that differ from one another by at least 45 degrees.
(123) (10) The imaging device according to any one of (1) to (9), further comprising: an imaging lens array including a plurality of imaging lenses, wherein the imaging lens array is positioned between the phase difference array and the imaging element array.
(124) (11) The imaging device according to (10), wherein the imaging lenses of the imaging lens array image the plurality of sets of light beams onto at least some of the imaging elements.
(125) (12) The imaging device according to any one of (1) to (11), wherein each of the imaging elements includes a plurality of pixels.
(126) (13) The imaging device according to any one of (1) to (12), wherein light from a first area of an imaged object is included in a first one of the sets of light beams, and wherein light from the first area of the imaged object is included in a second one of the sets of light beams.
(127) (14) A detection apparatus, comprising: a connecting structure; a light source, wherein the light source is connected to the connecting structure; an enclosure, wherein the enclosure is connected to the connecting structure, and wherein the enclosure includes: a phase difference array with a plurality of elements, wherein the phase difference array is configured to provide different optical paths for light included within at least some of a plurality of sets of light beams; and an imaging element array including a plurality of imaging elements, wherein at least one of the imaging elements is configured to receive one of the sets of light beams from the phase difference array; and a display, wherein the display is connected to the connecting structure, and wherein the display is operable to display detection information generated from data provided by the imaging element array.
(128) (15) The detection apparatus according to (14), further comprising: an objective lens array, wherein the objective lens array includes a plurality of objective lenses, and wherein the objective lens array is configured to provide the plurality of sets of light beams to the phase difference array.
(129) (16) The detection apparatus according to (15), wherein the plurality of sets of light beams provided by the objective lens array are cylindrical parallel light beams.
(130) (17) The detection apparatus according to any one of (14) to (16), wherein at least some of the elements of the phase difference array are configured to generate an optical path difference between a first portion of a light beam incident on a first part of the element and a second portion of the light beam incident on a second part of the element.
(131) (18) The detection apparatus according to (17), wherein for at least some of the elements of the phase difference array a thickness of the first part of the element of the phase difference array is different than a thickness of the second part of the element of the phase difference array.
(132) (19) The detection apparatus according to (18), wherein the first part of the element has a semicircular area.
(133) (20) The detection apparatus according to (18), wherein the first part of the element has a cylindrical area.
(134) (21) The detection apparatus according to (18), wherein a thickness of the first part of the element increases from an element at a first end of the phase difference array to an element at a second end of the phase difference array.
(135) (22) The detection apparatus according to any one of (14) to (21), wherein the enclosure further includes: a polarizer array, wherein the polarizer array includes four different types of polarizers that differ from one another by at least 45 degrees.
(136) (23) The detection apparatus according to any one of (14) to (22), wherein the connecting structure is a belt.
(137) (24) A method for detecting a physical property, comprising: emitting light onto an object; receiving light from the object at a plurality of phase difference elements included in a phase difference array, wherein at least some of the phase difference elements generate a phase difference from the light incident on the phase difference elements; receiving light from the phase difference elements at an imaging element array; and displaying information obtained from hyperspectral imaging (HSI) data based on output signals of the imaging element array.
(138) (25) An imaging device including
(139) an imaging element array that captures a same imaging area, as a plurality of unit images; and
(140) a phase difference array that causes respective different optical path differences in a portion of respective imaging areas of the plurality of unit images which are captured by the imaging element array.
(141) (26) The imaging device according to (25),
(142) wherein the phase difference array includes a filter that causes the optical path differences in a semicircular shape for the respective imaging areas, and
(143) wherein the optical path differences are different for the respective imaging areas of the plurality of unit images.
(144) (27) The imaging device according to (25) or (26),
(145) wherein a filter constituting the phase difference array has sufficiently small refractive index dispersion in a wavelength range to be measured, or is a reflection type with incidence of 45 degrees.
(146) (28) The imaging device according to any one of (25) to (27),
(147) wherein the imaging element array captures images caused for the respective imaging areas by the phase difference array, as interference images.
(148) (29) The imaging device according to (28), further including
(149) a signal processing unit that generates an interferogram from output data of pixels at the same position of the respective interference images that are captured for the respective imaging areas by the imaging element array, and calculates spectral characteristics of the respective pixels as hyperspectral imaging (HSI) data cubes by performing Fourier transform on the interferogram.
(150) (30) The imaging device according to (29),
(151) wherein a phase difference of the phase difference array is set so as to monotonically increase or monotonically decrease, in a predetermined direction of the imaging areas which are arranged consecutively, and
(152) wherein the signal processing unit generates a depth image by using an image of an imaging area at one end and an image of an imaging area at the other end, in the predetermined direction of the phase difference array, as a stereo image.
(153) (31) The imaging device according to any one of (25) to (30),
(154) wherein optical elements in the respective imaging areas of the imaging element array are formed at a wafer level,
(155) wherein a lens array in a preceding row, a phase difference array, and a lens array in a subsequent row are defined as the optical elements, and
(156) wherein the imaging device further includes a camera array configured with the optical elements and the imaging element array.
(157) (32) The imaging device according to (31),
(158) wherein the imaging element array includes at least one imaging element area for capturing a unit image of a monochrome image or an image generated by an RGB color filter, which does not have a phase difference and is not an interference image.
(159) (33) The imaging device according to (32),
(160) wherein the imaging element area is 4n (n is an integer of 1 or greater) times the imaging area of the unit image for the hyperspectral imaging (HSI) data cube.
(161) (34) The imaging device according to any one of (25) to (33),
(162) wherein one set of polarizers of four orientations is arranged for respective four imaging areas in the phase difference array, with respect to the camera array, and wherein the signal processing unit calculates a Stokes parameter or a Jones vector of each image point in the unit image, based on pixel signals of the imaging areas of the one set of polarizers.
(163) (35) An imaging method of an imaging device, the imaging device including an imaging element array that captures a same imaging area, as a plurality of unit images, and a phase difference array that causes respective different optical path differences in a portion of respective imaging areas of the plurality of unit images which are captured by the imaging element array, the method causing
(164) the imaging element array to capture the same imaging area, as the plurality of unit images, and
(165) the phase difference array to cause the respective different optical path differences in a portion of respective imaging areas of the plurality of unit images which are captured by the imaging element array.
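The interferogram-to-spectrum processing described in configuration (29) above can be sketched as follows. This is a simplified illustration under stated assumptions, not the claimed signal processing unit 52: it assumes the optical path difference increases linearly by a fixed step across the N camera units, stacks the same pixel position across the N unit images to form the interferogram, and applies a real FFT to obtain spectral magnitudes. The function name and parameters are hypothetical.

```python
import numpy as np

def hsi_cube_from_interferograms(unit_images, opd_step):
    """unit_images: array of shape (N, H, W) -- the same scene captured
    through N phase-difference elements whose optical path difference
    increases linearly by opd_step (e.g. in micrometers) per element.
    Returns a (N//2 + 1, H, W) cube of spectral magnitudes per pixel
    and the corresponding wavenumber axis (cycles per unit of opd_step)."""
    n = unit_images.shape[0]
    # Interferogram per pixel: intensity as a function of optical path
    # difference; subtract the DC (mean) term before the transform.
    interferogram = unit_images - unit_images.mean(axis=0, keepdims=True)
    # Real FFT along the OPD axis yields the spectrum of each pixel.
    spectrum = np.abs(np.fft.rfft(interferogram, axis=0))
    wavenumber = np.fft.rfftfreq(n, d=opd_step)
    return spectrum, wavenumber
```

For monochromatic input, the per-pixel interferogram is a cosine in the optical path difference, and the transform concentrates its energy at the bin matching the light's wavenumber, which is the basis of Fourier-transform spectroscopy.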
(166) It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
REFERENCE SIGNS LIST
(167) 11 Blood test apparatus
(168) 12 Arm
(169) 12a Artery
(170) 12b Vein
(171) 31 Body
(172) 32, 32-1, 32-2 Light source
(173) 33 Display unit
(174) 34 Belt
(175) 51 Camera array
(176) 52 Signal processing unit
(177) 71 Lens array
(178) 72 Phase difference array
(179) 72a Light shielding portion
(180) 72b Object
(181) 73 Lens array
(182) 74 Imaging element array