Hemispherical star camera
10901190 · 2021-01-26
Assignee
Inventors
- Erik L. Waldron (Concord, MA, US)
- Gregory P. Blasche (Burlington, MA, US)
- Paul Bohn (Brighton, MA, US)
- Robin Mark Adrian Dawson (Waltham, MA, US)
- Walter Foley (Colton, NY, US)
- Samuel Harrison (Acton, MA, US)
- Matthew T. Jamula (Brighton, MA, US)
- Juha-Pekka J. Laine (Boston, MA, US)
- Benjamin F. Lane (Grafton, MA, US)
- Sean McClain (Somerville, MA, US)
- Francis J. Rogomentich (Wilmington, MA, US)
- Stephen P. Smith (Acton, MA, US)
- John James Boyle (Bourne, MA, US)
CPC classification
G02B13/06
PHYSICS
H04N23/55
ELECTRICITY
H04N23/11
ELECTRICITY
H04N23/45
ELECTRICITY
H04N25/702
ELECTRICITY
H04N23/695
ELECTRICITY
International classification
G02B13/00
PHYSICS
G02B13/06
PHYSICS
Abstract
A digital camera optically couples a monocentric lens to image sensor arrays, without optical fibers, yet shields the image sensor arrays from stray light. In some digital cameras, baffles are disposed between an outer surface of a monocentric lens and each image sensor array to shield the image sensor arrays from stray light. In other such digital cameras, an opaque mask defines a set of apertures, one aperture per image sensor array, to limit the amount of stray light. Some digital cameras include both masks and baffles.
Claims
1. A digital camera, comprising: a monocentric lens having a focal length, an outer spherical surface and a center; a plurality of pixelated optical sensor arrays, each pixelated optical sensor array having a plurality of pixels and being oriented toward the center of the monocentric lens and spaced apart from the outer spherical surface of the monocentric lens, such that the pixelated optical sensor array is disposed about the focal length of the monocentric lens from the center of the monocentric lens; and a plurality of tubular baffles, one tubular baffle of the plurality of tubular baffles for each pixelated optical sensor array of the plurality of pixelated optical sensor arrays, the baffle being disposed between the outer spherical surface of the monocentric lens and the pixelated optical sensor array, the baffle corresponding to the pixelated optical sensor array and having a longitudinal axis normal to the baffle's corresponding pixelated optical sensor array and extending through the center of the monocentric lens, the baffle enclosing a light path volume through which light passes optically unaltered while blocking zeroth-order stray light paths, and, the baffle being disposed such that only light that enters and exits the monocentric lens via the outer spherical surface, without internally reflecting off any planar surface of the monocentric lens, enters the baffle.
2. A digital camera according to claim 1, wherein a portion, less than all, of the outer spherical surface of the monocentric lens comprises a mask that defines a plurality of transparent apertures therethrough and is otherwise opaque at a predetermined wavelength, such that each aperture of the plurality of apertures is aligned with a respective baffle of the plurality of tubular baffles and limits an amount of light that can pass from the monocentric lens to the corresponding pixelated optical sensor array, wherein the mask is shaped as at least a portion of a spherical surface.
3. A digital camera according to claim 1, wherein the monocentric lens has no central aperture.
4. A digital camera according to claim 1, wherein each baffle is cylindrical.
5. A digital camera according to claim 1, wherein each baffle is frustoconical.
6. A digital camera according to claim 1, wherein a wall of at least one baffle is opaque at a predetermined wavelength.
7. A digital camera according to claim 1, wherein a wall of at least one baffle has a total hemispherical reflectivity of less than about 25% at a predetermined wavelength.
8. A digital camera according to claim 1, wherein at least one baffle is spaced apart from the outer spherical surface of the monocentric lens by at least about 1 mm.
9. A digital camera according to claim 1, wherein one end of at least one baffle is in contact with the outer spherical surface of the monocentric lens.
10. A digital camera according to claim 1, wherein at least one baffle is spaced apart from the baffle's corresponding pixelated optical sensor array by at least about 1 mm.
11. A digital camera according to claim 1, wherein one end of at least one baffle is in contact with the baffle's corresponding pixelated optical sensor array.
12. A digital camera according to claim 1, wherein at least one pixelated optical sensor array of the plurality of pixelated optical sensor arrays is planar.
13. A digital camera according to claim 1, wherein at least one pixelated optical sensor array of the plurality of pixelated optical sensor arrays is curved.
14. A digital camera according to claim 1, wherein at least one pixelated optical sensor array of the plurality of pixelated optical sensor arrays is substantially parallel to the outer spherical surface of the monocentric lens.
15. A digital camera according to claim 1, further comprising: an object catalog storing information about objects expected to be viewed by the digital camera; and a navigation controller communicatively coupled to the object catalog and to the plurality of pixelated optical sensor arrays, wherein the navigation controller uses at least some of the information stored in the object catalog and image data from at least one pixelated optical sensor array of the plurality of pixelated optical sensor arrays to automatically determine at least one of: a location of the digital camera and an orientation of the digital camera.
16. A digital camera according to claim 1, further comprising: an object catalog storing information about objects expected to be viewed by the digital camera; and a navigation controller communicatively coupled to the object catalog and to the plurality of pixelated optical sensor arrays, wherein the navigation controller uses at least some of the information stored in the object catalog and image data from at least one pixelated optical sensor array of the plurality of pixelated optical sensor arrays to automatically generate course correction information.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) The invention will be more fully understood by referring to the following Detailed Description of Specific Embodiments in conjunction with the Drawings.
DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS
(19) In accordance with embodiments of the present invention, methods and apparatus are disclosed for high signal-to-noise digital cameras with monocentric lenses that are optically coupled to image sensor arrays without optical fibers, yet do not require external baffles to reduce stray light impingement on the image sensor arrays.
(20) Some conventional image-based trackers include wide field-of-view digital cameras.
(22) Additional information about monocentric lenses is available in Lens Design Fundamentals, by Rudolf Kingslake, Academic Press, Dec. 2, 2012, the entire contents of which are hereby incorporated by reference, for all purposes. As described by Kingslake, "A lens in which all the surfaces are concentric about a single point is called monocentric. The nodal points of such a lens are, of course, at the common center because any ray directed toward this center is undeviated. Hence the principal and nodal points also coincide at the common center. The image of a distant object is also a sphere centered about the same common center, of radius equal to the focal length. Monocentric systems can be entirely refracting or may include reflecting surfaces."
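Kingslake's description can be made concrete with the standard paraxial formulas for a simple ball lens (a single-glass monocentric lens). The function names and the numerical values below are illustrative assumptions, not values taken from the patent.

```python
def ball_lens_efl(radius, n):
    """Effective focal length of a ball lens of the given radius and
    refractive index n, measured from the lens center (for a monocentric
    lens the principal and nodal points coincide at the center)."""
    return n * radius / (2.0 * (n - 1.0))

def ball_lens_bfl(radius, n):
    """Back focal length: distance from the rear surface of the ball to
    the focal (image) sphere, i.e. EFL minus the radius."""
    return ball_lens_efl(radius, n) - radius

# Illustrative example: a 10 mm radius ball of n = 1.5 glass focuses
# 15 mm from its center, so a sensor would sit about 5 mm off the surface.
print(ball_lens_efl(10.0, 1.5))  # 15.0
print(ball_lens_bfl(10.0, 1.5))  # 5.0
```

Note that for n = 2 the back focal length goes to zero, i.e. the image sphere coincides with the ball's surface; for lower-index glasses the sensors stand off from the outer spherical surface, which is the geometry the baffles in this disclosure occupy.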
(23) Additional information about prior art cameras is available in Optimization of two-glass monocentric lenses for compact panoramic imagers: general aberration analysis and specific designs, by Igor Stamenov, Ilya P. Agurok and Joseph E. Ford, Applied Optics, Vol. 51, No. 31, Nov. 1, 2012, pp. 7648-7661, as well as U.S. Pat. No. 3,166,623 titled Spherical Lens Imaging Device, by J. A. Waidelich, Jr., filed Dec. 29, 1960, the entire contents of all of which are hereby incorporated by reference herein for all purposes. The camera 100 is conceptually similar to a larger monocentric objective camera called AWARE2 and developed at Duke University.
(24) The ball lens 102 enables the digital camera 100.
(25) Embodiments of the present invention optically couple monocentric lenses to image sensor arrays without optical fibers, yet shield the image sensor arrays from stray light. In some embodiments, baffles are disposed between an outer surface of a monocentric lens and each image sensor array to shield the image sensor arrays from stray light. In some embodiments, an opaque mask defines a set of apertures, one aperture per image sensor array, to limit the amount of stray light. Some embodiments include both masks and baffles.
Baffle Embodiments
(27) In some embodiments, each optical sensor array is adjustably mounted to a frame, so tip, tilt and focus of the optical sensor array may be independently adjusted. In some embodiments, the monocentric lens is mounted to a frame by struts that compensate for thermal expansion of other components.
(28) The hemispherical star camera 1400 includes several digital camera assemblies.
(33) Collectively, the adjusters 1710-1714 change tip, tilt and focus (distance) of the digital camera 1702, relative to the monocentric lens 1402.
(34) Each adjuster 1710-1714 may include a manually-adjustable screw that controls the length of the adjuster. In some embodiments, each adjuster 1710-1714 is motor driven, and its motor (not shown) is controlled by a computer (not shown). The computer may analyze images from the digital camera 1702 for image quality, as measured by center, focus, contrast, modulation transfer function (MTF) or any other suitable measure, and automatically drive the motors to turn one or more of the adjusters 1710-1714, as needed, to change the tip, tilt and/or focus to improve the image quality.
(35) The adjusters 1710-1714 may be manually or automatically adjusted to achieve a best compromise focus across an entire surface of an optical sensor array of the digital camera 1702. The adjusters 1710-1714 may also be adjusted to compensate for movement of elements resulting from launch vibration or the like.
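The automated adjustment described in paragraphs (34) and (35) amounts to maximizing a focus metric over an adjuster position. A minimal sketch, assuming an illustrative contrast metric and a hypothetical `capture` callable that stands in for the camera-plus-motor hardware (neither is specified by the patent):

```python
def sharpness(image):
    """Illustrative focus metric: variance of horizontal pixel
    differences. Sharper (higher-contrast) images score higher."""
    diffs = [row[i + 1] - row[i] for row in image for i in range(len(row) - 1)]
    mean = sum(diffs) / len(diffs)
    return sum((d - mean) ** 2 for d in diffs) / len(diffs)

def hill_climb(capture, position, step=1.0, min_step=0.01):
    """Drive one adjuster to maximize the focus metric.

    `capture(pos)` is a hypothetical callable that moves the adjuster to
    `pos` and returns the resulting image as a list of pixel rows."""
    best = sharpness(capture(position))
    while step >= min_step:
        for trial in (position + step, position - step):
            score = sharpness(capture(trial))
            if score > best:
                best, position = score, trial
                break  # improvement found: keep climbing at this step size
        else:
            step /= 2.0  # no improvement either way: refine the step
    return position
```

In practice each of the adjusters 1710-1714 would be searched in turn (or jointly), and a measure such as MTF could replace the simple contrast metric.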
(36) The pixelated optical sensor arrays 306 are electrically coupled to processing electronics 312.
(37) In use, the hemispherical star camera 300 may image one or more stars, natural or artificial satellites, other relatively bright navigational objects, terrain, sea surface, target landmarks or the like. The navigation controller 1302 may use at least some of the information stored in the object catalog 1300 and image data from at least one pixelated optical sensor array of the set of pixelated optical sensor arrays 306 to automatically determine a location of the camera 300 and/or an orientation (attitude) of the camera 300, such as for an artificial satellite or other vehicle. For example, the navigation controller may compare image data from the set of pixelated optical sensor arrays 306 to expected image data stored in the object catalog 1300. The expected image data may include images or the like expected to be observed along a desired path and, optionally, images or the like expected to be observed along one or more incorrect (undesired) paths. Thus, if actual observed image data matches the desired image data, the navigation controller may conclude the camera 300 is traveling along the desired path. The processing electronics 312 may, but need not, use at least some of the information stored in the object catalog 1300 and image data from at least one pixelated optical sensor array of the set of pixelated optical sensor arrays 306 to automatically generate course correction information for a vehicle. For example, if actual observed image data does not match the desired image data, the navigation controller may conclude the camera 300 is not traveling along the desired path. A difference between actual observed image data and the desired path image data (or image data related to an incorrect path) may be used to calculate a course correction.
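One classical way a star camera's attitude can be recovered from two identified catalog objects is the TRIAD algorithm, a standard star-tracker technique offered here as an illustration rather than as the patent's specific method. `b1` and `b2` are directions measured in the camera frame; `r1` and `r2` are the same objects' catalog directions.

```python
def cross(a, b):
    """Cross product of two 3-vectors (tuples)."""
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def unit(v):
    """Normalize a 3-vector to unit length."""
    m = (v[0]**2 + v[1]**2 + v[2]**2) ** 0.5
    return (v[0]/m, v[1]/m, v[2]/m)

def triad_basis(v1, v2):
    """Right-handed orthonormal triad built from two directions."""
    t1 = unit(v1)
    t2 = unit(cross(v1, v2))
    return (t1, t2, cross(t1, t2))

def triad(b1, b2, r1, r2):
    """Attitude matrix A (3x3 nested lists) mapping the catalog
    (reference) frame to the camera frame, so that A applied to r
    approximately reproduces b."""
    B = triad_basis(b1, b2)
    R = triad_basis(r1, r2)
    # A = sum over k of the outer product of B[k] and R[k]
    return [[sum(B[k][i] * R[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]
```

For example, catalog directions r1 = (1, 0, 0) and r2 = (0, 1, 0) observed as b1 = (0, 1, 0) and b2 = (-1, 0, 0) yield a 90-degree rotation about z. A real navigation controller would first have to identify which catalog objects correspond to which image centroids, e.g. by matching angular separations.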
(38) Optionally or alternatively, the processing electronics 312 may include image processing circuits and/or software (collectively referred to as an image processor) 1304, such as for compressing the data from the pixelated optical sensor arrays 306. Optionally or alternatively, the processing electronics 312 may send location information, orientation information, course correction information, processed or unprocessed data from the pixelated optical sensor arrays 306 or other data (collectively indicated at 1306) the processing electronics 312 derives from the data from the pixelated optical sensor arrays 306 to another system or subsystem, indicated at 1308, for processing. The other system or subsystem may be in the same vehicle as the camera 300 or it may be external to the vehicle, such as in a ground station.
(39) As noted, in some embodiments, baffles are used to shield the pixelated optical sensor arrays 306 from stray light. Baffles are cylindrical or frustoconical tubes used to enclose a light path or block zeroth-order stray light paths. Vanes are structures on baffles that block light scattered from the baffles.
(40) Frustum means a cone or pyramid whose tip has been truncated by a plane parallel to its base. Frustoconical means having the shape of a frustum of a cone. Frusta is the plural form of the noun frustum.
(44) The monocentric lens 302 has no central aperture. Each baffle 400-402 defines an aperture, through which light intended to impinge on the baffle's corresponding pixelated optical sensor array passes. The aperture may be defined by one end of the baffle, the other end of the baffle, some inside circumference of the baffle intermediate the two ends of the baffle or a combination of two or more thereof.
(45) An entrance pupil is an optical image of a physical aperture stop, as seen through the front of a lens system. The corresponding image of the aperture as seen through the back of the lens system is called an exit pupil. The entrance pupil is usually a virtual image that lies behind the first optical surface of the optical system. The aperture defined by each baffle 400-402 creates a corresponding entrance pupil and a corresponding exit pupil.
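The pupils of paragraph (45) can be located with the Gaussian thin-lens equation: the entrance pupil is the image of the physical aperture stop formed by the optics in front of it. A minimal sketch using the "real is positive" sign convention; the focal length and stop distance below are illustrative assumptions, not values from the disclosure.

```python
def image_distance(s_obj, f):
    """Gaussian thin-lens equation, 1/s_obj + 1/s_img = 1/f, in the
    'real is positive' convention. A negative result is a virtual image
    on the object's side of the lens, as is typical for an entrance
    pupil."""
    return 1.0 / (1.0 / f - 1.0 / s_obj)

def lateral_magnification(s_obj, s_img):
    """m = -s_img / s_obj; the pupil diameter is |m| times the
    physical stop diameter."""
    return -s_img / s_obj

# Illustrative: a stop 5 units behind a front group of focal length 10
# images to a virtual entrance pupil 10 units behind the front surface,
# magnified 2x.
s_img = image_distance(5.0, 10.0)
print(s_img)                              # -10.0
print(lateral_magnification(5.0, s_img))  # 2.0
```

In this disclosure the "stop" role is played by the aperture each baffle defines, so each baffle contributes its own entrance and exit pupil as the text states.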
(48) Additional information about stray light management may be found in Stray Light Analysis and Control, especially Chapter 9, Baffle and Cold Shield Design, by Eric C. Fest, SPIE Press, 2013, the entire contents of which are incorporated herein by reference for all purposes.
Mask Embodiments
(49) As noted, in some embodiments, an opaque mask defines a set of apertures, one aperture per image sensor array. A portion 800 of a hemispherical star camera that includes such a mask 802 is schematically illustrated in the Drawings.
(51) Each aperture 1000-1004 is centered on a respective imaginary line normal to the aperture's corresponding pixelated optical sensor array 306 and extending through the center 308 of the monocentric lens 302. Such lines are exemplified by dashed lines 810 and 812 in the Drawings.
(52) The apertures 1000-1004 permit light to travel through to the pixelated optical sensor arrays 306. Each aperture 1000-1004 limits an amount of light that can pass from the monocentric lens 302 to the aperture's corresponding pixelated optical sensor array 306. The mask 802 or 902 is otherwise opaque at a predetermined wavelength.
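The geometry of paragraph (51) fixes where each aperture sits: on the line from the lens center toward its sensor array, at the mask-sphere radius. A minimal sketch (the direction vector and radius are illustrative):

```python
def aperture_center(sensor_direction, mask_radius):
    """Point on the spherical mask where an aperture is centered: along
    the line from the lens center toward the corresponding pixelated
    optical sensor array, scaled to the mask radius."""
    m = sum(c * c for c in sensor_direction) ** 0.5
    return tuple(mask_radius * c / m for c in sensor_direction)

# Illustrative: a sensor straight up the z axis, mask radius 25 mm.
print(aperture_center((0.0, 0.0, 2.0), 25.0))  # (0.0, 0.0, 25.0)
```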
(53) Some embodiments include baffles and a mask.
(55) Thus, hemispherical star cameras that include baffles only, masks only and combinations of masks and baffles have been described. These and other embodiments of the present invention provide stray light rejection without requiring optical fiber couplings between a lens and a set of pixelated optical sensor arrays, and without the bulk and/or weight associated with external baffles.
(56) While the invention is described through the above-described exemplary embodiments, modifications to, and variations of, the illustrated embodiments may be made without departing from the inventive concepts disclosed herein. Furthermore, disclosed aspects, or portions thereof, may be combined in ways not listed above and/or not explicitly claimed. Accordingly, the invention should not be viewed as being limited to the disclosed embodiments.
(57) Although aspects of embodiments may be described with reference to flowcharts and/or block diagrams, functions, operations, decisions, etc. of all or a portion of each block, or a combination of blocks, may be combined, separated into separate operations or performed in other orders. All or a portion of each block, or a combination of blocks, may be implemented as computer program instructions (such as software), hardware (such as combinatorial logic, Application Specific Integrated Circuits (ASICs), Field-Programmable Gate Arrays (FPGAs) or other hardware), firmware or combinations thereof.
(58) Embodiments may be implemented by a processor executing, or controlled by, instructions stored in a memory. The memory may be random access memory (RAM), read-only memory (ROM), flash memory or any other memory, or combination thereof, suitable for storing control software or other instructions and data. Instructions defining the functions of the present invention may be delivered to a processor in many forms, including, but not limited to, information permanently stored on tangible non-writable storage media (e.g., read-only memory devices within a computer, such as ROM, or devices readable by a computer I/O attachment, such as CD-ROM or DVD disks), information alterably stored on tangible writable storage media (e.g., floppy disks, removable flash memory and hard drives) or information conveyed to a computer through a communication medium, including wired or wireless computer networks.