Temporal-spectral multiplexing sensor and method

11047737 · 2021-06-29

Abstract

A temporal-spectral multiplexing sensor for simultaneous or near simultaneous spatial-temporal-spectral analysis of an incoming optical radiation field. A spectral encoder produces a time series of spectrally encoded optical images at a high sampling rate. A series of full panchromatic spectrally encoded optical images are collected at a rate similar to the sampling rate. A detector records at least one spatial region of the spectrally encoded optical image. A processor is configured to process two series of spectrally encoded optical images to produce an artifact-free spectral image. The processing includes using the panchromatic images to normalize the spectrally encoded images, and decoding the normalized encoded images to produce high fidelity spectral signatures, free of temporal artifacts due to fluctuations at frequencies slower than the sampling rate for polychromatic images.

Claims

1. A temporal-spectral multiplexing sensor for simultaneous or near simultaneous spatial-temporal-spectral analysis of an incoming optical radiation field, comprising: a spectral encoder that produces a time series of spectrally encoded optical images at a high sampling rate; means of collecting a series of full panchromatic spectrally encoded optical images at a rate similar to the sampling rate; a detector that records at least one spatial region of the spectrally encoded optical image; and a processor that is configured to process two series of spectrally encoded optical images to produce an artifact-free spectral image, wherein the processor is configured to: use the panchromatic images to normalize the spectrally encoded images, and decode the normalized encoded images to produce high fidelity spectral signatures, free of temporal artifacts due to fluctuations at frequencies slower than the sampling rate for polychromatic images.

2. The temporal-spectral multiplexing sensor of claim 1, where simultaneous spatial-temporal analysis comprises the co-processing of near-simultaneous and co-registered hypertemporal and hyperspectral imagery to monitor, detect, identify, or characterize objects or events based on the combination of spectral and temporal signatures.

3. The temporal-spectral multiplexing sensor of claim 2, wherein the panchromatic signal is analyzed for its temporal content in order to reveal the frequency components of temporal signal oscillations due to changes in the object scene, such as mechanical vibrations or source emission changes, or to suppress line-of-sight motion induced signal variance.

4. The temporal-spectral multiplexing sensor of claim 2, wherein the image-to-image correlation between panchromatic images is used to generate a clutter mitigation projection operator that corrects the panchromatic and spectral imagery by projecting the images into a clutter-free subspace.

5. The temporal-spectral multiplexing sensor of claim 2, wherein object detection and/or characterization is based on combined HTI/HSI signatures, and where the analysis is performed by calculating power spectral density.

6. The temporal-spectral multiplexing sensor of claim 2, wherein at least one of object detection or characterization is based on combined HTI/HSI signatures, and where the analysis is performed by principal component analysis.

7. The temporal-spectral multiplexing sensor of claim 2, wherein at least one of object detection or characterization is based on HTI analysis of a specific spectral band or combination of spectral bands.

8. The temporal-spectral multiplexing sensor of claim 7, wherein at least one of object detection or characterization comprises atmospheric absorption compensation for intensity-varying backgrounds.

9. The temporal-spectral multiplexing sensor of claim 2, wherein at least one of object detection or characterization comprises whitening of the spectral content prior to application of a detection operator, such as the ACE (Adaptive Cosine Estimator), or other projection operators.

10. The temporal-spectral multiplexing sensor of claim 2, wherein at least one of object detection or characterization comprises whitening of the temporal content prior to application of a detection operator.

11. The temporal-spectral multiplexing sensor of claim 2 wherein the panchromatic image stream is analyzed for its temporal content in order to reveal temporal frequency components of the signal, to suppress line-of-sight motion induced signal variance effect on HSI data fidelity, to capture high fidelity spectral signatures from targets with temporally unstable radiance or to perform object detection and/or characterization based on combined HTI/HSI signatures.

12. The temporal-spectral multiplexing sensor of claim 1, where the spectral encoder comprises: a system of alternately applying a selected encoding transform and the complement of the selected encoding transform simultaneously or sequentially; and the panchromatic images are produced by summation of the signals generated by the encoding transform with the signal generated by the complement of the encoding transform.

13. The temporal-spectral multiplexing sensor of claim 12, where the encoded image is captured by one detector and the complement of the encoded image is captured by another detector.

14. The temporal-spectral multiplexing sensor of claim 1, where the means of collecting the panchromatic image includes alternating spectrally encoded images with un-encoded, panchromatic images.

15. The temporal-spectral multiplexing sensor of claim 14, where the spectral encoder applies a series of spectral band pass filters.

16. The temporal-spectral multiplexing sensor of claim 1, wherein using the panchromatic images to normalize the spectrally encoded images comprises applying a clutter mitigation projection operator, generated from the panchromatic images, to the spectrally encoded images prior to decoding.

17. The temporal-spectral multiplexing sensor of claim 1 that allows capture of high fidelity spectral signatures from targets with temporally unstable radiance.

18. The temporal-spectral multiplexing sensor of claim 1 where the spectral encoder comprises: a first polychromator that disperses the light to form a dispersed spectral image on a spatial light modulator, encoding a set of spectral bands by actuating specific areas of the spatial light modulator, a second polychromator that is configured to recombine the light; and a detector that is configured to form a spectrally encoded image.

19. The temporal-spectral multiplexing sensor of claim 1 where the detector comprises a single-element detector and captures a single spatial element.

20. The temporal-spectral multiplexing sensor of claim 1 where the detector comprises a linear array of optical detectors and captures a linear array of spatial elements.

21. The temporal-spectral multiplexing sensor of claim 1 where the detector comprises a two-dimensional array of optical detectors which capture a two-dimensional image.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

(1) Other objects, features and advantages will occur to those skilled in the art from the following description of the preferred and alternative embodiments of the invention, and the accompanying drawings, in which:

(2) FIG. 1 is a schematic of the hardware and software for a preferred embodiment using a dispersive transform spectral imager.

(3) FIG. 2 is a schematic of a second hardware embodiment of a dispersive transform imager using multiple detectors to simultaneously record a spectral image and its complement.

(4) FIG. 3 is a schematic of a third hardware embodiment using fixed bandpass spectral filter.

(5) FIG. 4 is a schematic of the software block diagram showing the first method to generate and capture artifact-free spatial and temporal images with a dispersive transform spectral imager.

(6) FIG. 5 is a schematic of the software block diagram showing the second method to generate and capture artifact-free spatial and temporal images with a dispersive transform spectral imager.

(7) FIG. 6 is a schematic of the software block diagram showing the method to generate and capture artifact-free spatial and temporal images with a bandpass filter-based spectral imager.

DETAILED DESCRIPTION

(8) FIG. 1 shows a schematic layout of the hardware and software for a dispersive transform imager that produces a series of spectrally encoded images and processes those images to produce HTI and HSI data. FIG. 1 uses a single focal plane array (FPA) detector to collect the encoded images. FIG. 2 shows an alternative hardware embodiment of a dispersive transform imager with two focal plane array detectors. FIG. 3 shows an alternative embodiment for a sensor based on fixed band-pass spectral filters.

(9) The dispersive transform embodiments of FIGS. 1 and 2 use a spectrometer to produce a dispersed image on a spatial light modulator that encodes a series of images using a series of spectrally-selective masks. In standard Hadamard transform encoding, each individual mask produces an encoded polychromatic image comprising the contribution of approximately half of all spectral bands the instrument collects. In each subsequent frame, a different set of approximately half of the bands is present in the image, and the process continues until the number of captured frames is equal to the number of spectral bands, N, at which time, a spectrum is recoverable for each pixel by weighted sums of the images. Even for temporally stationary input, polychromatic images vary and the sequence of captured images flickers.
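The encode/decode cycle described above can be illustrated with a short sketch; the Simplex-matrix construction, band count, and variable names below are assumptions for the example, not the patented implementation:

```python
import numpy as np

def s_matrix(n):
    """Build an n x n Simplex (S) matrix from a Sylvester Hadamard
    matrix of order n + 1 (this construction assumes n + 1 is a
    power of two; each mask row passes (n + 1)/2 of the n bands)."""
    H = np.array([[1]])
    while H.shape[0] < n + 1:
        H = np.block([[H, H], [H, -H]])   # Sylvester doubling
    # Drop the first row/column; -1 entries become open mask elements.
    return (H[1:, 1:] == -1).astype(float)

n = 7                              # number of spectral bands, N
S = s_matrix(n)                    # one mask per frame, ~half the bands each
f = np.linspace(1.0, 2.0, n)       # true spectrum at one pixel (stationary)
g = S @ f                          # N encoded polychromatic samples
f_hat = np.linalg.solve(S, g)      # recover the spectrum by weighted sums
```

Because each sample in `g` sums a different half of the bands, the captured sequence flickers even though `f` is constant, exactly as the paragraph notes.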

(10) The hardware of FIGS. 1 and 2 is used to collect complementary images containing complementary sets of spectral bands that sum together to make a panchromatic image. Summation of a pair of such complementary-encoded images negates the encoding and produces a true, temporally and spatially correct panchromatic image free of encoding artifacts. Such an image can be successfully analyzed by HTI temporal analysis techniques. This panchromatic image can also be used to normalize the individual spectrally encoded frames prior to decoding, in order to remove spurious spectral features due to intensity fluctuations during the measurement cycle. Complementary images can be collected sequentially, using the hardware of FIG. 1, or captured simultaneously, on two different focal plane arrays (FPAs), using the hardware of FIG. 2. In the latter case, the spatial light modulator would be a reflective digital micromirror array (DMA), in which each element of the array can be tilted to send the light in one of two directions, one towards FPA1 and the second towards FPA2.

(11) The basic software structure and method of operation is common to all hardware embodiments. In each case, software running on a processor takes a sequence of images, each encoded with a different spectral bandpass, and decodes the images to produce HSI and HTI data. It then combines the data to detect, characterize, and classify the objects in the image.

(12) The software block diagram shown in FIG. 4 outlines the first method for generating and capturing artifact-free spatial and temporal images with a dispersive transform spectral imager. This method is based on generating and capturing a set of complementary-encoded polychromatic images for each Hadamard mask, and summing the two complements to obtain the panchromatic image. Image encoding starts with the selection of the encoding transform, followed by executing an algorithm that generates the encoding spatial mask and transfers it to the SLM. In the case of the Hadamard transform, encoding is accomplished by using a Simplex (S) matrix containing digital ones and zeros. The complement matrix is created by replacing all the ones in the original matrix by zeros, and all the zeros by ones. A series of binary masks corresponding to the encoding transform and the complement of the selected transform are generated as described above and applied to the SLM. Each “mask” selects a specific combination of spectral bands as prescribed by the encoding transform, as taught in, e.g., Goldstein et al., U.S. Pat. Nos. 7,324,196 and 8,305,575. The mask sequence is applied repetitively while recording a series of image frames, one for each applied mask and its complement. From the generated image stream, a series of artifact-free, panchromatic spatial images is created by summation of complementary images as described above, and the resulting data are analyzed by HTI processing algorithms. The spectrally encoded images are normalized to the total intensity in the summed complements, and then decoded using the inverse transform, to produce spectral imagery that is fed to the HSI processing algorithms. HSI and HTI data are updated periodically, based on the last N sets of encoded complements. The HSI data may be updated on every frame, and are fully refreshed every N frames.
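A minimal numerical sketch of this normalization step, using a toy 3-band S-matrix and an invented fluctuation profile (it assumes the complement pair sees the same fluctuation, i.e. the simultaneous capture of FIG. 2):

```python
import numpy as np

S  = np.array([[1., 0., 1.],        # toy 3-band Simplex masks
               [0., 1., 1.],
               [1., 1., 0.]])
Sc = 1.0 - S                        # complement masks (ones <-> zeros)

f = np.array([2.0, 5.0, 3.0])       # true spectral shape at one pixel
I = np.array([1.0, 0.7, 1.3])       # per-frame scene intensity fluctuation

g  = I * (S  @ f)                   # encoded frames, modulated by I(t)
gc = I * (Sc @ f)                   # complement-encoded frames

pan = g + gc                        # artifact-free panchromatic: I(t) * sum(f)
g_norm = g / pan                    # normalize out the common fluctuation
f_shape = np.linalg.solve(S, g_norm)  # decode: spectral shape, scaled 1/sum(f)
```

The overall scale of the spectrum is not known in advance, so the normalization recovers the spectral shape up to the factor `f.sum()`, free of the fluctuation `I`.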

(13) For the hardware of FIG. 2, both complements are acquired simultaneously, and a unique set of spectral data is collected for every N exposures. For the hardware of FIG. 1, the complements are introduced sequentially, which doubles the time taken to generate an entire Hadamard-encoded sequence for spectral analysis; a unique set of data is collected for every 2N exposures. In this case, however, two valid spectrally encoded sequences are created in that time interval, thereby improving the signal-to-noise ratio by √2 due to signal averaging.

(14) The second method that may be used for generating and capturing artifact-free spatial and temporal images with a dispersive transform spectral imager is illustrated in the software block diagram shown in FIG. 5. In this case we do not use a complement mask, but rather insert a mask with all micromirrors in the “on” position, producing interleaved, unfiltered images. This approach produces essentially equivalent information to the first method, as the complement of each spectral image can be generated by subtracting the spectral image from the “on” image. In contrast to method 1, the unfiltered spatial image is produced directly, and need not be synthesized from the two complements. All other data processing proceeds as in method 1. The spectral results are the same, but only ½ as many spectral measurements are made per unit time.
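The complement synthesis of this second method can be sketched as follows (a toy 3-band S-matrix and a stable spectrum, both invented for the example):

```python
import numpy as np

S = np.array([[1., 0., 1.],         # toy 3-band Simplex masks
              [0., 1., 1.],
              [1., 1., 0.]])
f = np.array([2.0, 5.0, 3.0])       # spectrum at one pixel

g_enc = S @ f                       # spectrally encoded frames
g_on  = np.ones(3) * f.sum()        # interleaved all-mirrors-"on" frames
g_comp = g_on - g_enc               # synthesized complement frames
```

`g_comp` equals what the complement mask (1 − S) would have measured directly, so all later processing can proceed as in method 1.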

(15) The third method to perform spectral/temporal data acquisition applies to a spectrometer that uses fixed spectral bandpass filters instead of transform-based encoding. See FIG. 3. In this case, the instrument operates by sequentially inserting bandpass filters in the image plane of the sensor, resulting in the creation of a MultiSpectral Imaging (MSI) data stack. The data processing for this embodiment is shown in FIG. 6. The data stack may contain unfiltered images in addition to the filtered ones, in order to provide additional images that monitor the overall image intensity. The unfiltered images can be inserted between each pair of filtered images, or on a periodic basis, to produce a set of m unfiltered images for every N filtered images, where m<N. Image processing otherwise proceeds as in method 2 (FIG. 5). HTI analysis can be performed on unfiltered images as above, or on spectrally filtered images, provided that the compared filtered images are collected with the same filter. The higher signal level in unfiltered images may be used for HTI analysis to gain better sensitivity. The unfiltered images may also be used to normalize the filtered images in a manner analogous to the Hadamard-encoded imagery.
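One way to sketch the interleaved normalization for the filter-wheel case; the single-band (delta) filters, drift profile, and m = N interleaving (one unfiltered monitor frame per filtered frame) are all simplifying assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4                                    # number of bandpass filters, N
F = np.eye(n)                            # idealized filters: one band each
f = np.array([1.0, 4.0, 2.0, 3.0])       # band radiances at one pixel

I = 1.0 + 0.2 * rng.standard_normal(n)   # slow intensity drift, per frame pair
unfiltered = I * f.sum()                 # monitor frames (no filter inserted)
filtered = I * (F @ f)                   # MSI data stack, modulated by drift

ratio = filtered / unfiltered            # drift-free band fractions
```

Each filtered frame is divided by its adjacent unfiltered monitor, removing the common drift and leaving the relative band intensities.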

(16) One of the most stressing cases for a spectrometer is recovering a spectrum with intensity variations during the sampling interval. For example, intensity variations may result from changes in source emittance, illumination conditions, reflectance properties, turbulence, or sensor motion. The effect on the measured spectrum is an additional signal, termed clutter, that obscures the spectrum. The sampling rate of a standard spectrometer is the reciprocal of the time required to generate a spectrum, and it sets a limit on the sample stability required to avoid unacceptable clutter levels.

(17) In standard time-multiplexed spectrometers, accurate measurement of a spectrum requires it to be stable while intensity versus wavelength is measured by taking a series of N samples. Herein we disclose a technique that reduces the stability requirement from the time required to take the full spectrum to the time required to take one sample (embodiment of FIG. 2) or two samples (embodiments of FIGS. 1 and 3). During the same time, a temporal signal is recovered that is free from the encoding artifacts typically present when recording spectra in a time series of samples. This signal is used to correct the spectrally encoded signals for intensity fluctuations. For instance, for a spectrum with 100 individual spectral channels, the sample stability requirement is reduced by a factor of 50 (embodiments of FIGS. 1 and 3) or 100 (embodiment of FIG. 2), and the temporal signal is recovered at half the sampling rate (FIGS. 1 and 3) or at the full sampling rate (embodiment of FIG. 2).

(18) A brief outline of the mathematics behind the procedure is presented here using integral transformation notation, with a specific example presented afterwards in matrix notation. The description is general to spectrometers using any encoding transform, including the Hadamard transform and the Fourier transform.

(19) In standard spectrometers applicable to a stable signal, a spectrum f(λ) is encoded into a function of time g(t) by applying a transformation kernel, which is a series of masks M(t, λ),
g(t) = ∫ M(t, λ) f(λ) dλ
Note that g(t) is the measured quantity and we would like to recover f(λ). g(t) is a temporal measurement with variations due to the kernel. The spectrum does not depend on time; therefore, g(t) does not contain HTI information about the source being measured. In Hadamard spectroscopy M(t, λ) is known as an S-matrix, and in scanning or filter-based instruments M(t, λ) is a series of band passes. The process is inverted to recover the spectrum,
f̂(λ) = ∫ M⁻¹(t, λ) g(t) dt
where f̂(λ) is the recovered value of f(λ).

(20) This procedure is applicable when the spectrum f(λ) is only a function of λ. It does not work well for time varying signals, because in such cases the spectrum s(t, λ) is a function of both time and wavelength. The time dependence passes through the transformation and corrupts the result. For the time dependent case, we employ the product solution,
s(t, λ) = I(t) f(λ)
g(t) I(t) = ∫ M(t, λ) s(t, λ) dλ = I(t) ∫ M(t, λ) f(λ) dλ

(21) The measured quantity now varies with time. We see that s(t, λ) is the product of two data dimensions of interest, f̂(λ) for HSI analysis and Î(t) for HTI analysis. To separate these data dimensions, we introduce the innovation of the method of complementary matrices. When complementary matrices are measured, we can recover the integral over wavelength versus time, Î(t). This results in the three fundamental equations of the combined HTI/HSI method. The first two provide the decoupled HTI data dimension Î(t) and the equation defining the complementary matrix,
Î(t) = I(t) ∫ f(λ) dλ = ∫ (M(t, λ) + Mᶜ(t, λ)) I(t) f(λ) dλ = g(t) + gᶜ(t), where M(t, λ) + Mᶜ(t, λ) = 1
Mᶜ(t, λ) is the series of complement masks, and gᶜ(t) is the transformation with the complement masks. Thus, we have introduced gᶜ(t) as an additional measured quantity. This is used to scale (normalize) the function of wavelength during inversion, removing the time dependence. The result is the decoupled HSI data dimension f̂(λ),
f̂(λ) = ∫ M⁻¹(t, λ) g(t) I(t) I(t)⁻¹ dt = ∫ M⁻¹(t, λ) g(t) dt
where I(t)⁻¹ = ∫ f(λ) dλ / Î(t) and I(t) I(t)⁻¹ = 1. When g(t) and gᶜ(t) are obtained at slightly different times, these equalities are approximate. For the embodiments represented in FIGS. 1 and 3 and the methods shown in FIGS. 4 and 5, g(t) and gᶜ(t) are sequential polychromatic frames, and the approximation is valid if the time variation is minimal between sequential frames.

(22) Finally, decoupling the dimensions requires the solution of the second equation defining the complementary matrix, which may be done when any two of the terms are measured. For instance, the instrument may measure g(t) and 1, where 1 corresponds to a measurement excluding the transformation kernel. An example of using 1 is embodied in FIG. 2 and FIG. 4. Systems like the one embodied in FIG. 2 need only use a time-independent transformation kernel to separate the time and wavelength dependence.

(23) A mathematical embodiment of the invention, consistent with the embodiments of FIGS. 1 and 2 and the methods of FIGS. 4 and 5, is presented here, providing an example of using the above equations.

(24) We start with a standard Hadamard imager, which does not take advantage of the combined HTI/HSI processing of this invention. For an imaging sensor pixel over n integration time periods, we may define the vectors g_t (polychromatic, discrete time), f_λ (discrete wavelength), and e (error), each having dimensions of wavelengths × 1. A matrix equation relates them through the S-matrix transformation (a specific embodiment derived from the Hadamard transformation),
g_t = S_tλ f_λ + e
which is a matrix representation of the integral transformation equation, with S being a specific example of the more general transformation M(t, λ). In the FIG. 4 embodiment, the transformation is done in hardware using a grating to disperse the light, a DMA to perform the transformation, and a grating to recombine the light. The process is repeated until the number of imager frames collected equals the number of spectral bands (i.e., the vectors can also be written as square matrices). Solving for the true spectrum f̂_λ by neglecting the error yields,
f̂_λ = S_λt⁻¹ g_t
Combining the two equations gives,
f̂_λ = f_λ + S_λt⁻¹ e
where the error in the spectrum is,
S_λt⁻¹ e = f̂_λ − f_λ
The matrix S⁻¹ is composed of entries ±1 scaled by 2/(n+1). Thus, the mean square error for a spectral band is approximately 4σ²/n. For n bands, the mean square error improvement factor over a single-slit spectrometer is n/4, or the signal-to-noise ratio is increased by a factor of ~√n/2.
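The structure of S⁻¹ can be checked numerically; the closed form below is the standard Simplex-matrix inverse from the Hadamard-transform literature, stated here as an assumption rather than quoted from this patent:

```python
import numpy as np

def s_matrix(n):
    """n x n Simplex matrix from a Sylvester Hadamard matrix of order n + 1."""
    H = np.array([[1]])
    while H.shape[0] < n + 1:
        H = np.block([[H, H], [H, -H]])
    return (H[1:, 1:] == -1).astype(float)

n = 7
S = s_matrix(n)

# Assumed closed form: S^-1 = (2/(n+1)) * (2 S^T - J), entries +-2/(n+1)
J = np.ones((n, n))
S_inv = (2.0 / (n + 1)) * (2.0 * S.T - J)
```

Every entry of S⁻¹ has magnitude 2/(n + 1), which is where the ~n/4 mean-square-error improvement over a single-slit instrument comes from.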

(25) For the embodiments shown in FIGS. 1 and 2, and the method of FIG. 4 of this invention, two sets of polychromatic images are generated,
g_tj = S_tjλ f_λ and g_tj+1 = (1 − S_tj+1λ) f_λ, where {t_j | t_j = jΔt, j = 0, 2, 4, …, 2(N_λ − 1)},
t_j is the start of an integration time period, Δt is the integration time, N_λ is the number of wavelengths, and 1 is a matrix of all 1's (not the unit matrix). The error terms have been neglected to simplify the discussion. As before, the spectra are obtained by inversion. What makes this process uniquely beneficial is the summation of the two sets of equations,
g_tj + g_tj+1 = S_tjλ f_λ + (1 − S_tj+1λ) f_λ
which yields a set of N_λ images Î_tj. For the case of sequential measurements, the intensities Î_tj are approximations to the integrated band images over twice the integration time period, capturing ½ of the photon flux (the 1's in the transformation matrices), and Δt′ = 2Δt. If both transformations S and (1 − S) are done at the same time (FIG. 2), then the approximation becomes an equality, the full photon flux is utilized, and Δt′ = Δt. This is true because the rejected photons of one transformation, e.g. the zero elements of S, are the accepted photons of the other transformation, e.g. the one elements of (1 − S), which corresponds to the embodiment shown in FIG. 2. For this embodiment, the matrix is symmetric, which means that once enough measurements are taken to invert the transformation (square matrix, i.e. the number of polychromatic images equals the number of wavelengths), the inversion can be done after each additional measurement of the vector and complement pair. This yields an updated spectrum at the data rate of 1/Δt′ (full frame rate for simultaneous measurements, and half the frame rate for sequential measurements).

(26) The new transformation enables novel subspace hyperspectral/hypertemporal imaging techniques. For example, the topic of subspace hyperspectral imaging typically involves techniques that use hyperspectral images to define a vector space basis (via PCA or endmembers). The subspace is then a projection of reduced dimensionality in this basis (i.e. there are more pixels in the scene image than unique materials). Here we use a projection operator, which is generated from the summed complementary images (or a transformation free image), to generate a subspace in a vector space determined from temporal variances. The operator effectively describes temporal artifacts caused by line-of-sight motion “jitter clutter” and sensor noise, without being required to capture spectral variability or variability due to the multiplexing encoding.

(27) Here we demonstrate how to use the vector space derived from temporal variance to improve the recovery of spectra, with a method only applicable to the unique data generated by the hardware described herein. A set of clutter-mitigated images L is generated by applying the projection operator to G matrices created from the vectors g_tj (G_e, even) and g_tj+1 (G_o, odd), with decoding using S⁻¹ and (1 − S)⁻¹,
L = G_e S⁻¹ P + G_o (1 − S)⁻¹ P, where P = a aᵀ
a is any subset of the eigenvectors of the covariance matrix (the choice is application dependent). P and the decoding matrices do not commute. The projector has row and column dimensions equal to the number of summed complementary images, and the encoded images have row and column dimensions equal to the number of wavelengths. By design these dimensions are equivalent. L is the projection of the original set of encoded images into the reduced subspace defined by the projector.
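A sketch of building and applying such a projector; the synthetic jitter, toy dimensions, and the choice of which eigenvectors to keep are all assumptions made for the example:

```python
import numpy as np

def s_matrix(n):
    """n x n Simplex matrix from a Sylvester Hadamard matrix of order n + 1."""
    H = np.array([[1]])
    while H.shape[0] < n + 1:
        H = np.block([[H, H], [H, -H]])
    return (H[1:, 1:] == -1).astype(float)

rng = np.random.default_rng(1)
n_pix, n = 50, 7
S = s_matrix(n)

F = rng.random((n_pix, n))                    # true spectra, one row per pixel
jitter = 1.0 + 0.05 * rng.standard_normal((n_pix, n))  # "jitter clutter"

G_e = jitter * (F @ S.T)                      # encoded frames (pixels x frames)
G_o = jitter * (F @ (1.0 - S).T)              # complement frames

pan = G_e + G_o                               # summed complementary images
w, V = np.linalg.eigh(np.cov(pan, rowvar=False))  # temporal-variance eigenbasis
a = V[:, :-1]                                 # drop the strongest clutter mode
P = a @ a.T                                   # clutter-mitigation projector

# Decode each complement stream, then project into the clutter-free subspace
L = G_e @ np.linalg.inv(S).T @ P + G_o @ np.linalg.inv(1.0 - S).T @ P
```

Keeping all but the largest-variance eigenvector is one possible choice; as the paragraph notes, the subset of eigenvectors retained is application dependent.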

(28) To demonstrate how HSI analysis is performed with this novel clutter rejection we will use a standard projection method known as ACE (Adaptive Cosine Estimator). The technique gets its name because detections are based on the cosine of the angle between the background and target in the detection hyperspace. ACE usually refers to using the squared cosine. Here, without loss of generality, we will use the cosine.

(29) To use ACE we need two vectors, the background and the target. These vectors can be written in terms of reflectance or radiance. L is an ordered set of background vectors, defined above, which are projections into a clutter mitigation subspace (also defined above). They have been mean subtracted but still require whitening. The whitened background is,
L̃ = L C_L^(−1/2), where C_L^(−1/2) = A_CL Λ^(−1/2) A_CLᵀ
where C_L is the covariance matrix, C_L^(−1/2) is the whitening operator, A_CL is the matrix of covariance matrix eigenvectors, and Λ is the diagonal matrix of eigenvalues. The target vector is treated in an equivalent way,
s̃ = s P C^(−1/2)

(30) A known target vector (with ground reflectance, atmospheric transmittance, etc.) is projected into the clutter mitigation subspace (P) and then transformed with the whitening operator (C^(−1/2)).

(31) The ACE detector is,

(32) ACE = s̃ l̃ᵀ / (√(s̃ s̃ᵀ) √(l̃ l̃ᵀ))
which is the cosine of the angle between the vectors. l̃ is a single vector taken from L̃.
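The whitening and cosine steps can be sketched end to end; the background statistics, target signature, and dimensions below are invented for the example, and the clutter projection P is omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(2)
n_pix, n_bands = 500, 6
scales = np.array([1., 3., 2., 1., 5., 2.])      # per-band background spread

L = rng.standard_normal((n_pix, n_bands)) * scales
L = L - L.mean(axis=0)                           # mean-subtracted background
s = np.array([0., 0., 0., 0., 1., 0.])           # known target signature

C = np.cov(L, rowvar=False)                      # background covariance C_L
w, A = np.linalg.eigh(C)
C_inv_half = A @ np.diag(w ** -0.5) @ A.T        # whitening operator C^(-1/2)

L_w = L @ C_inv_half                             # whitened background vectors
s_w = s @ C_inv_half                             # whitened target vector

def ace(l_vec, s_vec):
    """Cosine of the angle between a whitened pixel and the whitened target."""
    return (s_vec @ l_vec) / (np.sqrt(s_vec @ s_vec) * np.sqrt(l_vec @ l_vec))

scores = np.array([ace(l, s_w) for l in L_w])    # one detection score per pixel
```

Because the whitening operator is built from the sample covariance itself, the whitened background has identity covariance, and each score is a cosine bounded by ±1.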

(33) The result is a hyperspectral-based detection, with reduced clutter due to temporal variations in the scene. Clutter-induced biasing of the spectral signal is either eliminated (embodiment of FIG. 2, method of FIG. 4) or effectively limited to frequencies below 1/Δt′ = 1/(2Δt) for the embodiment of FIG. 1 and the methods of FIGS. 4-5.

DESCRIPTION OF THE PREFERRED EMBODIMENTS

(34) The sensor technology described herein may utilize Dispersive Transform Imaging Spectrometer (DTIS) technology as described in Vujkovic-Cvijin, P., N. Goldstein, M. J. Fox, S. D. Higbee, S. Latika, L. C. Becker, K. Teng, and T. K. Ooi, “Adaptive Spectral Imager for Space-Based Sensing,” Proc. SPIE Vol. 6206, paper 6206-33 (2006); Vujkovic-Cvijin, P., Jamin Lee, Brian Gregor, Neil Goldstein, Raphael Panfili and Marsha Fox, Infrared transform spectral imager, SPIE Optics+Photonics, 12-16 Aug. 2012, San Diego, Calif., SPIE Proceedings Vol. 8520, paper 8520-19 (2012); Goldstein, N., P. Vujkovic-Cvijin, M. Fox, B. Gregor, J. Lee, J. Cline, and S. Adler-Golden “DMD-based adaptive spectral imagers for hyperspectral imagery and direct detection of spectral signatures,” SPIE Vol. 7210, 721008-1-721008-8 (2009); and U.S. Pat. No. 7,324,196, for example, to produce images with spatial, temporal and spectral content suitable for analysis by the HTI/HSI algorithms described in this invention. FIG. 1 shows one embodiment of a spectral/temporal imager incorporating a programmable spatial light modulator (SLM) that allows programmable and adaptive selection of spectral bands. It incorporates foreoptics to deliver an image into the object field of the instrument. As shown in FIG. 1, a DTIS instrument typically uses two polychromators with a two-dimensional SLM placed in the intermediate image plane between them. The polychromators may use any method to spectrally disperse the radiation (diffraction grating, a prism, a grism (combination prism and grating), or a combination thereof) to produce a dispersed image on the SLM, where each wavelength of light falls on a different SLM element, as shown in FIG. 1, which illustrates how rays for two different wavelengths, represented by dotted lines and solid lines, are focused on different locations on the SLM. The SLM can be a micromirror array (DMA), as described in Goldstein et al., U.S. Pat. Nos. 
7,324,196 and 8,305,575, a reflective or diffractive ribbon array, a liquid-crystal array, an array of programmable apertures, or any other one-dimensional or two-dimensional SLM. The first polychromator in FIG. 1 disperses the incoming image into spectral components and images them onto the DMA, where images are encoded by modulating the intensity of selected spectral bands. The encoded image is subsequently spectrally recombined by the second polychromator, which reverses the dispersion of the first polychromator, and imaged onto one or more detector arrays, such as a focal plane array (FPA). In a special case, the detector in the focal plane may comprise a single detector element. The DMA may be programmed to generate a dynamic series of spatial patterns (masks) that implement spectral encoding according to the mathematical transform selected for the particular application. The transform may be the Hadamard transform, which provides high photon collection efficiency, since each of the Hadamard transform masks passes approximately half of the total photon flux collected, and each spectral channel is a normalized summation of half of the images. High photon collection efficiency results in a high signal-to-noise ratio (SNR). The implementation of the Hadamard transform may be in the form of Simplex (S) matrices, as described in U.S. Pat. Nos. 7,324,196 and 8,305,575. Spectral images may be decoded in the subsequent computation step via the inverse Hadamard transform to yield the spectral content of each pixel of the monitored scene. Since the instrument's SLM is programmable, it is possible to adjust spectral and temporal resolution in real time in order to optimize the tradeoff for dynamic monitoring requirements. Due to this encoding approach, which involves both spectral dispersion and spectral recombination (de-dispersion) (FIG. 1), the FPA of the DTIS sensor captures a series of polychromatic images. 
The generation of spatially and temporally correct images from the encoded data is performed by the methods described in detail above (FIG. 4 and FIG. 5).
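The S-matrix encoding and inverse-Hadamard decoding described above can be sketched numerically. The following is an illustrative sketch only; the function names, the Sylvester construction of the Hadamard matrix, and the toy spectrum are assumptions for demonstration, not the patented implementation:

```python
# Sketch (assumed, simplified): Simplex (S) matrix spectral encoding and
# closed-form inverse-Hadamard decoding, as outlined for the DTIS sensor.
import numpy as np

def s_matrix(order):
    """Build an (order-1) x (order-1) Simplex matrix from a Sylvester
    Hadamard matrix; 'order' must be a power of two."""
    h = np.array([[1]])
    while h.shape[0] < order:
        h = np.block([[h, h], [h, -h]])
    return ((1 - h[1:, 1:]) // 2).astype(float)   # entries in {0, 1}

def encode(spectrum, s):
    # Each mask (row of S) passes roughly half the spectral channels;
    # each measurement is the detector signal for one mask.
    return s @ spectrum

def decode(measurements, s):
    # Closed-form S-matrix inverse: S^-1 = 2 (2 S^T - J) / order
    m = s.shape[0]
    s_inv = 2.0 * (2.0 * s.T - np.ones((m, m))) / (m + 1)
    return s_inv @ measurements

order = 8
s = s_matrix(order)                         # 7 masks for 7 spectral channels
spectrum = np.array([3., 1., 4., 1., 5., 9., 2.])
recovered = decode(encode(spectrum, s), s)
print(np.allclose(recovered, spectrum))     # True
```

Because every S-matrix row passes about half of the collected flux, each detector reading averages over many spectral channels, which is the multiplex (Fellgett) advantage the paragraph attributes to Hadamard encoding.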

(35) In another embodiment, shown in FIG. 3 with the corresponding image processing method shown in FIG. 6, multispectral imaging is performed by applying a series of bandpass filters over the entire image. An example of such a system is a scanning apparatus that uses a rotating filter wheel composed of individual bandpass filters designed to select and isolate spectral regions of interest. Positioning of the filters within the wheel can be either sequential or random, controlled by a mechanical positioning system. In the case of a rotating positioner, a digitally controlled stepper motor is typically used. The positioner operates under the control of the computer, as instructed by the system software. Multispectral imaging resulting from this approach lacks the spectral resolution of an HSI system and may be slower due to the use of a mechanical filter-positioning system. A bandpass-filter-based MSI imager may also be slower to reconfigure itself on the fly by digital command than a DTIS HSI system. However, MSI systems are far less complex and far less costly than DTIS HSI systems. As is evident from the description above, both HSI and MSI systems are capable of combined spectral and temporal imaging, with a variety of data acquisition parameters.
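The filter-wheel acquisition sequence above can be sketched as a simple simulation. The array shapes, boxcar passbands, and toy scene below are illustrative assumptions, not the patent's hardware or software:

```python
# Minimal sketch (hypothetical parameters) of filter-wheel multispectral
# acquisition: each wheel position inserts one bandpass filter, and the
# camera integrates the filtered scene into a single frame.
import numpy as np

rng = np.random.default_rng(0)
n_bands, h, w = 16, 4, 4
scene = rng.random((n_bands, h, w))   # spectral radiance cube (toy data)

# Four boxcar bandpass filters, each passing four adjacent bands.
filters = [np.arange(i * 4, i * 4 + 4) for i in range(4)]

frames = []
for passband in filters:              # one stepper-motor position per filter
    frames.append(scene[passband].sum(axis=0))   # detector integrates band

cube = np.stack(frames)               # (4, h, w) multispectral image
print(cube.shape)                     # (4, 4, 4)
```

Each loop iteration stands in for one mechanical positioning step, which is why the text notes that this approach is slower to acquire and to reconfigure than a digitally programmed DTIS system.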

Alternative Embodiments

(36) In the case of DTIS-based transform-encoding imaging, alternative implementations may be used in the invention. These can include systems with all transmissive or all reflective optics, and systems that use a combination of transmissive and reflective elements. Such systems may include spectrometers of both the Offner and Dyson relay types, which use nearly concentric optical elements to achieve imaging with low optical aberrations. Several alternative embodiments that may be used for the optical system of the HTI/HSI sensor are described in U.S. Pat. Nos. 7,324,196 and 8,305,575.

(37) In the case of complementary encoding, images may be acquired either by using a single focal plane array that sequentially captures the two complementary encoded images created by a DMA (the preferred embodiment described above and in FIG. 1), or by recording the two images simultaneously. FIG. 2 shows an alternative implementation of the invention in which the inherent mechanism of DMA spatial light modulation by micromirror tilting is used to advantage. As is well known to those skilled in the art, binary on-off modulation of DMA pixels is performed by tilting individual micromirrors by an equal angle in each direction relative to the normal to the underlying substrate. As a consequence, the DMA always generates two reflected images, inherently binary-encoded with complementary transforms due to the complementarity of the “on” and “off” pixels. Therefore, when a transform is applied to the DMA, one of the reflected images contains the original transform while the other reflected image is automatically encoded by the complement of the transform. If these two images are captured by two separate paths through the second polychromator (FIG. 2), the input data for the summation process are created simultaneously by a single encoding mask. The separate paths can include two separate FPAs, as shown in FIG. 2, or, alternatively, the two paths can be combined on a single FPA. As a result of collecting both images simultaneously, the instrument acquires artifact-free spatial-temporal images at the full frame rate of the FPAs. This is in contrast to the case of a single FPA, where the need to generate the original and the complement mask sequentially makes the encoding sequence twice as long as the one with two FPAs.

(38) The processing techniques of this invention can also be applied to Fourier Transform Spectrometers, provided that the spectrometers include means of measuring either the complement of the encoding transform, or the time-varying panchromatic intensity from the source.

(39) Other embodiments will occur to those skilled in the art and are within the following claims.