Method and apparatus for imaging circadiometer
11503195 · 2022-11-15
Inventors
CPC classification
G02B26/008
PHYSICS
G01J3/0235
PHYSICS
H04N23/45
ELECTRICITY
H04N23/745
ELECTRICITY
G01J3/465
PHYSICS
H04N23/125
ELECTRICITY
H04N23/90
ELECTRICITY
G01J3/32
PHYSICS
International classification
G02B26/00
PHYSICS
Abstract
A system and method for an imaging circadiometer that measures the spatial distribution of eye-mediated, non-image-forming optical radiation within the visible spectrum.
Claims
1. An imaging circadiometer comprising: one or more optical elements positioned in sequence on an optical axis to image an environment; a photodetector array on the optical axis; a filter wheel having multiple wavelength filters that are individually positionable on the optical axis, the filter wheel positioned between the one or more optical elements and the photodetector array; and a digital image processing unit electrically connected to the photodetector array; wherein the multiple wavelength filters include: a wavelength filter with a melanopic spectral transmittance; a wavelength filter with a rhodopic spectral transmittance; a wavelength filter with an erythropic spectral transmittance; a wavelength filter with a chloropic spectral transmittance; and a wavelength filter with a cyanopic spectral transmittance.
2. The imaging circadiometer of claim 1, comprising: a prefiltering optic positioned on the optical axis between the one or more optical elements and the filter wheel to perform beam apodization, shaping, steering or attenuation; or a postfiltering optic positioned on the optical axis between the filter wheel and the photodetector array to perform further beam apodization, shaping, steering or attenuation; or both the prefiltering optic and the postfiltering optic.
3. The imaging circadiometer of claim 1, wherein the multiple wavelength filters include a filter with a neuropic spectral transmittance.
4. The imaging circadiometer of claim 1, comprising another identical imaging circadiometer apparatus positioned so that the two optical axes are separated by a distance.
5. The imaging circadiometer of claim 4, wherein the distance is a human interocular distance or an average human interocular distance.
6. The imaging circadiometer of claim 4, wherein the spectral transmittances of the multiple wavelength filters and spectral responsivities of the photodetector arrays are combined to enable the imaging circadiometer to quantify α-opic distributions of a display or a scene.
7. The imaging circadiometer of claim 6, wherein the display is a virtual reality display or a head-up stereo display.
Description
BRIEF DESCRIPTION OF DRAWINGS
DETAILED DESCRIPTION
(18) Referring to
(19) The optical elements 713 may include any combination of refractive lenses, reflective mirrors, and diffractive optical elements.
(20) A particular function of prefiltering optic 712 or postfiltering optic 704 may be to represent the unequal spatial distribution of both cones and ipRGCs across the retina. The optics may include spatially-varying neutral density filters to model the spatially-varying responsivity. (This function may also be performed as a post-processing step during image analysis, but it would be limited by the image dynamic range. A spatially-varying neutral density filter would preserve the full dynamic range of the images.)
(21) In addition to the spatial distribution of photoreceptors in the human retina, the observer's field of view is also dependent on their facial anthropometry. This issue has long been recognized in the calculation of visual glare by means of the Guth position index (e.g., CIE 1995. Discomfort Glare in Interior Lighting. CIE 117-1995, and Levin, R. E. 1975. "Position Index in VCP Calculations," J. Illuminating Engineering Society 4(2):99-105). These references reflect the angular distribution of the rods and cones within the human retina: the observer's sensitivity to visual glare is a function of both the vertical and horizontal angles from the gaze direction (where V/R and L/R are their respective tangents).
(22) An equivalent to the Guth position index for ipRGC spatial responsivity has yet to be formalized in the academic literature, but it certainly exists as shown by Glickman et al. 2003, Rügger et al. 2005, and others. An imaging circadiometer that measures the influence of optical radiation received by an observer for an unrestricted visual field would therefore require a fisheye lens to generate a hemispherical field of view, and a prefiltering or postfiltering optic to represent the position index spatial responsivity. (Alternatively, this function could be performed in software during image analysis.)
(23) In one embodiment, prefiltering optic 712 or postfiltering optic 704 comprises spatially-varying neutral density filters that are specific to the spatial sensitivities of the short-wavelength, medium-wavelength and long-wavelength cones, rods, ipRGCs, and the observer's field of view for neuropsin.
(24) Filter wheel filters 705 through 710 may include any combination of dyed glass or polymers, and thin-film interference filters or antireflection coatings deposited on a transparent substrate.
(25) There is further evidence that different combinations of red and blue light may enhance or suppress the production of melatonin and other hormones that contribute to the entrainment of circadian rhythms in humans and other mammals (e.g., Figueiro, M. G., and M. S. Rea. 2010. "The Effects of Red and Blue Lights on Circadian Variations in Cortisol, Alpha Amylase, and Melatonin," International Journal of Endocrinology Vol. 2010, Article ID 829351). As research into the effects of optical radiation on circadian rhythm entrainment and disruption proceeds, it may be necessary to account for them in the image generation and analysis of the circadiometer.
(26) In the simplest scenario, the effects of colored light may be due to a linear combination of the signals from the long-, medium-, and short-wavelength cones, the rods, and the ipRGCs. If so, then the images recorded through each filter may be combined by means of a linear transform, similar to the “spectral sharpening” techniques used for optimizing the spectral responses of RGB color sensors in digital cameras (e.g., Drew, M. S., and G. D. Finlayson. 2000. “Spectral Sharpening with Positivity,” J. Optical Society of America A 17(8):1361-1370). This could be accomplished in software during image analysis, although it would likely be necessary to take into account the history of previously recorded images in a sequence, as the response time of circadian entrainment and disruption to optical radiation exposure is measured in tens of minutes to hours.
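A minimal sketch of such a linear recombination follows. The 5×5 transform matrix here is a placeholder, not a published spectral-sharpening solution; in practice it would be derived from the measured spectral responsivities as in the cited optimization techniques.

```python
import numpy as np

def combine_channels(images, M):
    """Linearly recombine N per-filter images into N derived channel
    images: each output layer is a per-pixel weighted sum of the input
    layers, with weights taken from one row of M.

    images: array of shape (N, H, W) -- one layer per filter.
    M:      array of shape (N, N)    -- the linear transform.
    """
    return np.tensordot(M, images, axes=([1], [0]))

# Placeholder example: five alpha-opic layers and the identity transform,
# which leaves the layers unchanged. Off-diagonal entries would mix (or,
# if negative, subtract) layers, as contemplated in the text.
layers = np.random.rand(5, 4, 6)   # 5 filters, each a 4x6-pixel image
M = np.eye(5)
out = combine_channels(layers, M)
```

Negative matrix entries implement the subtractive combinations discussed below for hormone suppression effects.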
(27) In another scenario however, it is possible that the effects of colored light may be due to electrochemical interactions between the opsins in adjacent photoreceptors. If so, then the effect of colored light may involve changes in their spectral responsivity distributions, such that a linear combination of images recorded through filters 705 to 710 may not accurately represent the effects of the colored light.
(28) In an embodiment then, the filter wheel includes filters whose spectral transmittance correctly accounts for any changes in the spectral responsivity distributions due to electrochemical interactions with colored light. Given that colored light appears to produce both enhancement and suppression of circadian-linked hormones, a linear combination of images may include images that are subtracted rather than added.
(29) In an embodiment for use when measurement speed must be maximized, the filters 706 to 710 assembled within the filter wheel 711 in
(30) An optional sixth imaging subsystem 815 may be added to the embodiment illustrated in
(31) In one embodiment of the invention shown in
(32) Each digital imaging subsystem 910A-E has a different spectral responsivity distribution as determined by the combination of the spectral transmittance of the imaging optics module 940, the spectral transmittance distribution of the optical filter 930, and the spectral responsivity distribution of the imaging sensor 920.
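As an illustration of this combination, the overall spectral responsivity of a subsystem is the wavelength-by-wavelength product of the three curves named above. All curves in the sketch below are placeholder shapes, not measured data.

```python
import numpy as np

# Sample the visible band on a common wavelength grid (nm).
wavelengths = np.arange(380, 781, 5).astype(float)

# Placeholder curves: flat optics transmittance, a mock blue-green
# bandpass filter, and a mock sensor responsivity. Real curves would
# come from component measurements or datasheets.
tau_optics = np.full_like(wavelengths, 0.9)
tau_filter = np.exp(-0.5 * ((wavelengths - 490.0) / 40.0) ** 2)
r_sensor = np.clip(1.0 - np.abs(wavelengths - 550.0) / 400.0, 0.0, None)

# The subsystem's overall responsivity is the elementwise product.
r_combined = tau_optics * tau_filter * r_sensor
peak_nm = wavelengths[np.argmax(r_combined)]
```

With these placeholder curves the combined peak lands near the filter's passband, slightly pulled toward the sensor's peak, showing why all three factors must be characterized together.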
(33) The optical filter 930 may be an inorganic glass filter, an organic polymer filter, a thin film filter, a combination thereof, or any other transparent material with a desired spectral transmittance distribution.
(34) The spectral transmittance distribution of the optical filter 930 may be fixed, or it may be electrically tunable
(35) The optical filter 930 may further incorporate a linear or circular polarizer.
(36) In some embodiments, the imaging sensor 920 may be offset in the x-y plane with respect to the imaging optics axis 950.
(37) The resolution, size, and type of imaging sensor 920 may be different for each imaging subsystem 910A-E. For instance, a sensor used to collect the melanopic radiance may have a lower resolution than one or more of the other sensors. As another example, a sensor with a spectral range in the mid-infrared may have a lower resolution than a sensor with a spectral range in the visible region of the spectrum. Similarly, the optics module 940 may be different for each imaging subsystem 910A-E. Additionally, image sensor binning strategies may result in different effective resolutions for each image sensor 920, and specific region sampling strategies may result in different effective sizes for each image sensor 920. For example, a binning strategy may include binning 2×2, 3×3, 4×4, …, n×n pixels, where every n×n block of pixels within an image is summed, or potentially averaged, thus creating a new image with a new resolution given by Equation 1.
new resolution=original resolution/(n×n) (Eq. 1)
(38) In another embodiment shown in
(39) In another embodiment shown in
(40) As shown in
(41) In another embodiment shown in
(42) As shown in
(43) Optical corrector plate 1410 may be separate from optical filter 1460, or it may be combined into a combination filter and corrector plate. Depending on the dispersion characteristics of the transparent material, it may be necessary to limit the spectral bandwidth of the optical filter to avoid spectral smearing of the image on the sensor plane.
(44) In another embodiment shown in
(45) In one embodiment, optical axes 1560, 1565 are parallel. In another embodiment, the optical axes 1560, 1565 are not parallel and the fields of view of imaging systems 1501, 1505 overlap at some distant focal point. In this latter case, the filters 1520 and 1530 are mounted at a corresponding angle on the rotatable disks 1525, 1535, so that when they are on the respective optical axes they lie in planes normal to those axes. As may be readily understood, three or more imaging subsystems may be similarly arranged with common rotatable disks 1525 and 1535. As may also be readily understood, the filters 1520 and neutral density (or clear) filters 1530 may also be positioned along the common optical axes 1560 and 1565 by alternative positioning mechanisms, such as one or more linear translation stages, rather than by rotatable disks.
(46) In operation, neutral density (or clear) filters 1530 are rotated into position, following which the filters 1520 are rotated into position prior to opening shutters 1540 and 1545 and simultaneously capturing two digital images with image sensors 1550 and 1555. The captured images are processed by analog-to-digital converter and associated electronics modules 1570 and 1575 respectively, then transmitted to a computer system 1580 for further processing or data storage. The computer system comprises one or more processors connected to non-transient computer readable memory in which is stored computer readable data and computer executable instructions. The computer readable instructions are executed by the processor to perform the necessary processing of the captured images and to store and retrieve the data.
(47) In one useful configuration shown in
(48) In
(49) Each imaging subsystem 1720A-E may further include a plenoptic (also known as "light field") imaging subsystem, wherein the depth of field and target plane can be determined a posteriori using computational photography techniques, thereby obviating the need for autofocus capabilities.
(50) In
(51) In another embodiment, an optical flicker sensor (not shown) can be mounted parallel to the z-axis. In some embodiments the optical flicker sensor is included, but not mounted parallel to the z-axis. The optical flicker sensor may be used to determine an optimal set of exposure times to be used by the imaging subsystems 1820A-E.
(54) In another embodiment of
(55) Referring again to
(56) In the embodiment shown in
(57) These two images are composited into a single two-layer image 2250, i.e., "stacked" to generate a multispectral image. Image 2210 is subjected to a two-dimensional projective mapping, in other words a "keystone correction", so that it is registered with image 2220. The resulting image 2250 shows that image portion 2230 has been differentially stretched vertically into a rectangular shape that matches image portion 2240 and registered image portion 2260. Assuming that the imaging subsystems introduce only sub-pixel geometric distortion, in the ideal case there will be a one-to-one correspondence between the pixels of the two layers of image portion 2260. In practice it may be difficult to register images to within a few pixels because of focus or resolution limitations, the difficulty of accurately locating the common portions of the images, lens distortion, and other factors.
(58) The alignment of the images may include translation, rotation, keystone and magnification adjustments to one or more images, so as to register imaged objects in the same location within the multi-layered image. The images are intentionally overlapped so that the multi-layer registered image covers no more than the area common to the individual imaging subsystems.
(59) In general, an imaging subsystem whose optical axis is oblique to the plane of the imaged object must be calibrated in order to determine the necessary parameters for keystone correction. For each input image pixel with horizontal and vertical coordinates x, y, the transformation to output image pixel with horizontal and vertical coordinates x′, y′ is the rational linear mapping:
x′ = (ax + by + c)/(gx + hy + 1), y′ = (dx + ey + f)/(gx + hy + 1) (Eq. 2)
where a, b, c, d, e, f, g, and h are constants to be determined.
(60) To perform the calibration, four fiducial marks (ideally representing a square) are positioned on the object to be imaged. An image is captured, and the coordinates of the pixels representing the four fiducial marks are designated (x0, y0), (x1, y1), (x2, y2), and (x3, y3). As shown by Heckbert, P., 1999, "Projective Mappings for Image Warping," University of California Berkeley Computer Science Technical Report 15-869, the above constants are given by:
Δx1 = x1 − x2, Δy1 = y1 − y2 (Eq. 3)
Δx2 = x3 − x2, Δy2 = y3 − y2 (Eq. 4)
Σx = x0 − x1 + x2 − x3, Σy = y0 − y1 + y2 − y3 (Eq. 5)
g = (Σx Δy2 − Σy Δx2)/(Δx1 Δy2 − Δy1 Δx2) (Eq. 6)
h = (Δx1 Σy − Δy1 Σx)/(Δx1 Δy2 − Δy1 Δx2) (Eq. 7)
a = x1 − x0 + g·x1, d = y1 − y0 + g·y1 (Eq. 8)
b = x3 − x0 + h·x3, e = y3 − y0 + h·y3 (Eq. 9)
c = x0, f = y0 (Eq. 10)
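The calibration of Equations 3-10 and the mapping of Equation 2 can be sketched as follows. Function and variable names are ours; the computed constants map the unit square (0,0), (1,0), (1,1), (0,1) onto the four fiducial coordinates, per Heckbert.

```python
def keystone_coefficients(quad):
    """Compute the eight projective-mapping constants a..h (Eqs. 3-10)
    from the four fiducial coordinates (x0,y0)..(x3,y3)."""
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = quad
    dx1, dy1 = x1 - x2, y1 - y2              # Eq. 3
    dx2, dy2 = x3 - x2, y3 - y2              # Eq. 4
    sx = x0 - x1 + x2 - x3                   # Eq. 5
    sy = y0 - y1 + y2 - y3
    den = dx1 * dy2 - dy1 * dx2              # shared denominator of Eqs. 6-7
    g = (sx * dy2 - sy * dx2) / den          # Eq. 6
    h = (dx1 * sy - dy1 * sx) / den          # Eq. 7
    a, d = x1 - x0 + g * x1, y1 - y0 + g * y1   # Eq. 8
    b, e = x3 - x0 + h * x3, y3 - y0 + h * y3   # Eq. 9
    c, f = x0, y0                            # Eq. 10
    return a, b, c, d, e, f, g, h

def warp(x, y, coeffs):
    """Apply the rational linear mapping of Eq. 2 to one pixel."""
    a, b, c, d, e, f, g, h = coeffs
    w = g * x + h * y + 1.0
    return (a * x + b * y + c) / w, (d * x + e * y + f) / w
```

For example, `warp(0, 0, coeffs)` returns (x0, y0), and the other unit-square corners land on the remaining fiducial points.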
(61) Keystone correction is applied to one or more of the images captured by the embodiment shown in
(62) Once the necessary image transformations have been determined through calibration for each imaging subsystem of the imaging circadiometer, the transformations must be applied to each captured image. Because Equation 2 is evaluated independently for each pixel, it may be executed in parallel, e.g., using multithreaded operations on a multicore processor, a field-programmable gate array (FPGA), or a massively parallel graphics processing unit (GPU).
(63) For some applications, it may be necessary to downscale or upscale one or more images using known image processing techniques. For example, it may be necessary to downscale images in order to achieve image registration with images generated by the image sensor with the lowest resolution, or conversely upscale images to achieve image registration with images generated by the image sensor with the highest resolution.
(64) It may also be an advantage to downscale images by means of pixel binning when performing measurements for various α-opic metrics. For example, the resolution of the human eye is greater for green light than it is for blue light. Consequently, a full resolution image could be used for the scotopic measurements, while pixel binning could be employed to generate reduced resolution images for the S-cone and L-cone images. The advantages of such images include lower image storage requirements and increased image transmission and processing speeds, without sacrificing significant accuracy.
(66) In step 2310, the calibrated digital imaging subsystems are used to capture N spectrally-bandwidth-limited images, for example S-cone, M-cone, L-cone, scotopic, and ipRGC images.
(67) In step 2320, one or more of the N images may optionally be scaled such that all images have the same horizontal and vertical pixel resolution.
(68) In step 2330, keystone correction according to Equation 2 may be applied as required to one or more of the N images in order to facilitate image registration and stacking.
(69) In step 2340, one or more of the N images may be optionally offset vertically and/or horizontally, magnified and/or rotated in order to achieve per-pixel alignment of the target portions of the images. For example, the target portion may be the display area of an LCD screen.
(70) In step 2350, the N separate images are combined (or “stacked”) into a single multispectral image using a suitable image file format.
(71) In step 2360, per-pixel image metrics are calculated using the multispectral image data.
(72) Steps 2320-2360 are performed by a computer, such as computer 1580 or 1690.
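The stacking and metric steps (2350 and 2360) can be sketched as follows. The band-ratio metric is our placeholder choice, since the text leaves the per-pixel metrics open; it assumes the N images are already scaled, corrected, and registered by steps 2320-2340.

```python
import numpy as np

def stack_images(images):
    """Step 2350: combine N registered single-band images (each H x W)
    into one multispectral image of shape H x W x N."""
    return np.stack(images, axis=-1)

def per_pixel_band_ratio(multi, num_band, den_band, eps=1e-12):
    """Step 2360, illustrative only: a per-pixel ratio of two bands,
    for example a melanopic-to-photopic ratio. eps guards against
    division by zero in dark pixels."""
    return multi[..., num_band] / (multi[..., den_band] + eps)
```

A metric image produced this way has the same spatial resolution as the registered inputs, so it can be displayed or thresholded like any other image layer.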
(73) Throughout the description, specific details have been set forth in order to provide a more thorough understanding of the invention. However, the invention may be practiced without these particulars. In other instances, well known elements have not been shown or described in detail and repetitions of steps and features have been omitted to avoid unnecessarily obscuring the invention. Accordingly, the specification is to be regarded in an illustrative, rather than a restrictive, sense.
(74) The detailed description has been presented partly in terms of methods or processes, symbolic representations of operations, functionalities and features of the invention. These method descriptions and representations are the means used by those skilled in the art to most effectively convey the substance of their work to others skilled in the art. A software implemented method or process is here, and generally, understood to be a self-consistent sequence of steps leading to a desired result. These steps require physical manipulations of physical quantities. Often, but not necessarily, these quantities take the form of electrical or magnetic signals or values capable of being stored, transferred, combined, compared, and otherwise manipulated. It will be further appreciated that the line between hardware and software is not always sharp, it being understood by those skilled in the art that the software implemented processes described herein may be embodied in hardware, firmware, software, or any combination thereof. Such processes may be controlled by coded instructions such as microcode and/or by stored programming instructions in one or more tangible or non-transient media readable by a computer or processor. The code modules may be stored in any computer storage system or device, such as hard disk drives, optical drives, solid state memories, etc. The methods may alternatively be embodied partly or wholly in specialized computer hardware, such as an application specific integrated circuit (ASIC) or FPGA circuitry.
(75) It will be clear to one having skill in the art that further variations to the specific details disclosed herein can be made, resulting in other embodiments that are within the scope of the invention disclosed. Two or more steps in the flowchart may be performed in a different order, other steps may be added, or one or more may be removed without altering the main function of the invention. Electronic modules may be divided into constituent modules or combined into larger modules. All parameters, dimensions, materials, and configurations described herein are examples only and actual choices of such depend on the specific embodiment. Accordingly, the scope of the invention is to be construed in accordance with the substance defined by the claims.
(76) Claim Support
(77) Disclosed is an imaging circadiometer comprising an imaging optical system, an optional beam shaping or attenuation optic, a filter wheel, a two-dimensional photodetector array, and a digital image processing unit.
(78) In some embodiments, the filter wheel includes optical filters matching five or six of the following spectral transmittance distributions: S-cone, M-cone, L-cone, scotopic, ipRGC, and neuropic.
(79) Disclosed is an imaging circadiometer comprising: one or more optical elements positioned in sequence on an optical axis to image an environment; a photodetector array on the optical axis; a filter wheel having multiple filters that are individually positionable on the optical axis, the filter wheel positioned between the one or more optical elements and the photodetector array; and a digital image processing unit electrically connected to the photodetector array.
(80) In some embodiments, the imaging circadiometer comprises a prefiltering optic positioned on the optical axis between the one or more optical elements and the filter wheel to perform beam apodization, shaping, steering or attenuation; or a postfiltering optic positioned on the optical axis between the filter wheel and the photodetector array to perform further beam apodization, shaping, steering or attenuation; or both the prefiltering optic and the postfiltering optic.
(81) In some embodiments, the multiple filters include: a filter with a melanopic spectral transmittance; a filter with a rhodopic spectral transmittance; a filter with an erythropic spectral transmittance; a filter with a chloropic spectral transmittance; and a filter with a cyanopic spectral transmittance.
(82) In some embodiments, the multiple filters include a filter with a neuropic spectral transmittance.
(83) In some embodiments, another identical imaging circadiometer apparatus is positioned so that the two optical axes are separated by a distance.
(84) In some embodiments, the distance is a human interocular distance or an average human interocular distance.
(85) In some embodiments, spectral transmittances of the multiple filters and spectral responsivities of the photodetector arrays are combined to enable the imaging circadiometer to quantify α-opic distributions of a display or scene.
(86) In some embodiments, the display is a virtual reality display or a head-up stereo display.
(87) Also disclosed is an imaging circadiometer comprised of two or more arrangements of optical components, wherein: a first arrangement of optical components comprises: one or more imaging lenses; five or more filters; one or more neutral density filters; a mechanical or electro-optic shutter; and a digital image sensor; the one or more imaging lenses, the mechanical or electro-optic shutter and the digital image sensor are aligned on an optical axis; the five or more filters are individually positionable on the optical axis; the one or more neutral density filters are individually positionable on the optical axis; a second arrangement of optical components comprises: one or more further imaging lenses; the five or more filters; the one or more neutral density filters; a further mechanical or electro-optic shutter; and a further digital image sensor; the one or more further imaging lenses, the further mechanical or electro-optic shutter and the further digital image sensor are aligned on a further optical axis; the five or more filters are individually positionable on the further optical axis; and the one or more neutral density filters are individually positionable on the further optical axis.
(88) In some embodiments, the imaging circadiometer comprises a first mechanically rotatable disk on which the five or more filters are mounted; and a second mechanically rotatable disk on which the one or more neutral density filters are mounted.
(89) In some embodiments, there are three or more arrangements of optical components; and the five or more filters are individually positionable on an optical axis of each of a third or more of the three or more arrangements of optical components; and the one or more neutral density filters are individually positionable on the optical axis of each of the third or more of the three or more arrangements of optical components.
(90) In some embodiments, the optical axis is parallel to the further optical axis; or the optical axis is not parallel to the further optical axis and the two or more arrangements of optical components have fields of view that overlap.
(91) In some embodiments, the imaging circadiometer comprises a first mechanically rotatable disk on which the five or more filters are mounted; and a second mechanically rotatable disk on which the one or more neutral density filters are mounted; wherein: the optical axis is not parallel to the further optical axis; and the five or more filters and the one or more neutral density filters are mounted at a corresponding angle on the mechanically rotatable disks.
(92) In some embodiments, the imaging circadiometer comprises a linear translation stage on which the five or more filters are mounted; and a further linear translation stage on which the neutral density filters are mounted.
(93) In some embodiments, five of the five or more filters are different α-opic filters.
(94) In some embodiments, one of the five or more filters is a neuropic spectral response filter.
(95) In some embodiments, the digital image sensors have different resolutions, sizes or types; or at least one digital image sensor is offset from its optical axis; or the arrangements of optical components have a common alignment axis; or the arrangements of optical components have a common point of focus.
(96) In some embodiments, the arrangements of optical components have a common point of focus; and at least some of the digital image sensors are tilted with respect to their optical axes in accordance with the Scheimpflug condition.
(97) In some embodiments, one or more of the arrangements of optical components include a Scheimpflug normalizer prism.
(98) In some embodiments, the imaging circadiometer comprises a laser range finder, wherein each arrangement of optical components autofocuses on an object plane at a distance indicated by the laser range finder; a spectroradiometer that improves a measurement accuracy of each arrangement of optical components; an optical flicker sensor configured to determine an exposure time for the arrangements of optical components; or in each arrangement of optical components, a plenoptic imaging subsystem for determining a depth of field and a target plane of the arrangement of optical components using computational photographic imaging.
(99) Disclosed is an imaging circadiometer comprising: one or more optical elements positioned in sequence on an optical axis to image an environment; a photodetector array on the optical axis; a filter wheel having multiple filters that are situated to be individually positioned on the optical axis, the filter wheel positioned between the one or more optical elements and the photodetector array; and a digital image processing unit electrically connected to the photodetector array.
(100) Disclosed is an imaging circadiometer comprised of two or more arrangements of optical components, wherein: a first arrangement of optical components comprises: one or more imaging lenses; five or more filters; one or more neutral density filters; a mechanical or electro-optic shutter; and a digital image sensor; the one or more imaging lenses, the mechanical or electro-optic shutter and the digital image sensor are aligned on an optical axis; the five or more filters are situated to be individually positioned on the optical axis; the one or more neutral density filters are situated to be individually positioned on the optical axis; a second arrangement of optical components comprises: one or more further imaging lenses; the five or more filters; the one or more neutral density filters; a further mechanical or electro-optic shutter; and a further digital image sensor; the one or more further imaging lenses, the further mechanical or electro-optic shutter and the further digital image sensor are aligned on a further optical axis; the five or more filters are situated to be individually positioned on the further optical axis; and the one or more neutral density filters are situated to be individually positioned on the further optical axis.
(101) In some embodiments the imaging circadiometer comprises a first mechanically rotatable disk on which the five or more filters are mounted; and a second mechanically rotatable disk on which the one or more neutral density filters are mounted.
(102) In some embodiments there are three or more arrangements of optical components; and the five or more filters are situated to be individually positioned on an optical axis of each of a third or more of the three or more arrangements of optical components; and the one or more neutral density filters are situated to be individually positioned on the optical axis of each of the third or more of the three or more arrangements of optical components.