Real-time satellite imaging system
11496679 · 2022-11-08
Assignee
Inventors
CPC classification
B64G1/1028
PERFORMING OPERATIONS; TRANSPORTING
H04N7/181
ELECTRICITY
H04N23/90
ELECTRICITY
G02B23/04
PHYSICS
H04N23/951
ELECTRICITY
International classification
Abstract
Methods and apparatus for a Real-time Satellite Imaging System (10) are disclosed. More particularly, one embodiment of the present invention includes an imaging sensor (14) on a geostationary satellite having one or more co-collimated telescopes (18). The telescopes (18) illuminate focal planes (22) which are sparsely populated with focal plane arrays (24). The focal plane arrays (24) record the entire observable Earth hemisphere at one time, at least once every ten seconds.
Claims
1. An apparatus comprising: a satellite; said satellite being in orbit around the Earth; said satellite having an apogee generally out of the plane of the Equator; said satellite providing generally persistent coverage of the polar regions not observable from geosynchronous orbit; an image sensor; said image sensor being carried aboard said satellite; said image sensor including a plurality of co-collimated telescopes; said co-collimated telescopes being configured to produce a plurality of images of the same field of view; said image sensor being configured to be pointed generally toward Earth; said image sensor being a staring sensor; said image sensor including a focal plane; said focal plane including a plurality of focal plane arrays; said focal plane arrays being configured to produce a plurality of data; said focal plane arrays being sparsely populated upon said focal plane of each of said co-collimated telescopes; said plurality of focal plane arrays being stitched together to form a single cohesive image; a sensor image controller and a sensor image readout; said sensor image controller and said sensor image readout being connected to said plurality of focal plane arrays; said sensor image controller and said sensor image readout for semi-autonomously controlling said image sensor, and for producing fixed resolutions to provide optimized imagery and data, and still allow a frame rate of better than one frame every 10 seconds; said plurality of data including a plurality of images of a generally entire visible hemisphere of the surface of the Earth; said plurality of data providing persistent imaging of said generally entire visible hemisphere of the surface of the Earth; said plurality of images each being captured as images substantially simultaneously; said plurality of images each having resolutions that correspond with an image at nadir having at least one hundred meter resolution; and a transmitter; said transmitter being connected to said sensor image controller and said sensor image readout.
2. An apparatus as recited in claim 1, in which said image sensor is carried aboard a satellite in geostationary orbit.
3. An apparatus as recited in claim 1, in which each of said plurality of focal plane arrays is connected to a field programmable gate array.
4. An apparatus as recited in claim 1, in which each of said plurality of focal plane arrays is connected to a sensor image control element; each of said sensor image control elements is configured to limit a light integration time of its associated focal plane array.
5. An apparatus as recited in claim 1, in which each of said plurality of said sensor image control elements is configured to set the number of frames per second delivered in the plurality of data from the plurality of focal plane arrays from one-tenth of a frame per second to thirty frames per second.
6. An apparatus as recited in claim 1, further comprising: said sensor image readout element; said sensor image readout element being configured to read out in parallel said plurality of data from said plurality of focal plane arrays.
7. An apparatus as recited in claim 1, in which said sensor image readout elements are configured to transmit a maximum light level of each of said plurality of focal plane arrays to said sensor image control element.
8. An apparatus as recited in claim 1, in which said sensor image readout elements are configured to transmit a maximum light level of each of said plurality of focal plane array sub elements to said sensor image control element.
9. An apparatus as recited in claim 1, further comprising: an image processing element; said image processing element including a plurality of software controlled electronic processing elements; said software controlled electronic processing elements being configured to receive the raw levels of said plurality of data from said plurality of focal plane arrays.
10. An apparatus as recited in claim 1, further comprising: an electrically reactive high index of refraction material plate; a sensor image control element; said sensor image control element including a voltage control element; said voltage control element being configured to apply a plurality of different voltages across said electrically reactive high index of refraction material plate; said electrically reactive high index of refraction material plate altering the direction of a plurality of light rays illuminating each of said plurality of focal plane arrays.
11. An apparatus as recited in claim 1, in which said plurality of images is furnished for viewing at a remote location on Earth within not more than thirty seconds of the event being observed generally on the Earth.
12. An apparatus as recited in claim 11, in which said image sensor is a staring sensor.
13. An apparatus as recited in claim 11, in which each of said plurality of focal plane arrays is connected to a field programmable gate array.
14. An apparatus as recited in claim 11, in which each of said plurality of focal plane arrays is connected to a sensor image control element; each of said sensor image control elements is configured to limit a light integration time of its associated focal plane array from one microsecond to ten seconds.
15. An apparatus as recited in claim 11, in which each of said plurality of said sensor image control elements is configured to set the number of frames per second delivered in the plurality of data from the plurality of focal plane arrays from a frame every ten seconds to twenty frames per second.
16. An apparatus as recited in claim 11, further comprising: a sensor image readout element; said sensor image readout element being configured to read out in parallel said plurality of data from said plurality of focal plane arrays.
17. An apparatus as recited in claim 11, in which said sensor image readout element is configured to transmit a maximum light level of said focal plane array to said sensor image control element.
18. An apparatus as recited in claim 11, further comprising: an image processing element; said image processing element including a software controlled electronic processing element; said software controlled electronic processing element being configured to receive the raw levels of said plurality of data from said plurality of focal plane arrays.
19. An apparatus as recited in claim 11, in which said plurality of images is furnished for viewing at a remote location on Earth within not more than thirty seconds of the event being observed generally on the Earth.
20. An apparatus comprising: a satellite; said satellite orbiting the Earth in a Molniya orbit; said satellite providing generally continuous coverage of the polar regions not observable from geosynchronous orbit; an image sensor; said image sensor being carried aboard said satellite; said image sensor including a plurality of co-collimated telescopes; said co-collimated telescopes being configured to produce a plurality of images of the same field of view; said image sensor being carried aboard a satellite in geostationary orbit; said image sensor being configured to be pointed generally toward Earth; said image sensor including a focal plane; said focal plane including a plurality of focal plane arrays; said focal plane arrays being configured to produce a plurality of data; said plurality of focal plane arrays being stitched together to form a single cohesive image; said focal plane arrays being sparsely populated upon said focal plane of each of said co-collimated telescopes; a sensor image controller and a sensor image readout; said sensor image controller and said sensor image readout being connected to said plurality of focal plane arrays; said sensor image controller and said sensor image readout for semi-autonomously controlling said image sensor, and for producing fixed resolutions to provide optimized imagery and data, and still allow one frame per second frame rates; said plurality of data including a plurality of images of a generally entire visible hemisphere of the surface of the Earth; said plurality of data providing persistent imaging of said generally entire visible hemisphere of the surface of the Earth; said plurality of images each being captured as full images substantially simultaneously; said plurality of images each having resolutions that correspond with an image at nadir having at least one hundred meter resolution; and a transmitter; said transmitter being connected to said focal plane arrays; a parallel to serial bit stream serializer connected to said transmitter; said serial bit stream serializer for converting the parallel inputs from the detection elements within said image processor to a serial bit stream in preparation for transmission.
21. An apparatus comprising: a satellite; said satellite being in a highly inclined, highly elliptical orbit; said satellite having an orbit period longer than twelve hours; said satellite providing a longer persistence over the polar regions; said satellite providing generally continuous coverage of the polar regions not observable from geosynchronous orbit; an image sensor; said image sensor being carried aboard said satellite; said image sensor including a plurality of co-collimated telescopes; said co-collimated telescopes being configured to produce a plurality of images of the same field of view; said image sensor being carried aboard a satellite in geostationary orbit; said image sensor being configured to be pointed generally toward Earth; said image sensor including a focal plane; said focal plane including a plurality of focal plane arrays; said focal plane arrays being configured to produce a plurality of data; said plurality of focal plane arrays being stitched together to form a single cohesive image; said focal plane arrays being sparsely populated upon said focal plane of each of said co-collimated telescopes; a sensor image controller and a sensor image readout; said sensor image controller and said sensor image readout being connected to said plurality of focal plane arrays; said sensor image controller and said sensor image readout for semi-autonomously controlling said image sensor, and for producing fixed resolutions to provide optimized imagery and data, and still allow one frame per second frame rates; said plurality of data including a plurality of images of a generally entire visible hemisphere of the surface of the Earth; said plurality of data providing persistent imaging of said generally entire visible hemisphere of the surface of the Earth; said plurality of images each being captured as full images substantially simultaneously; said plurality of images each having resolutions that correspond with an image at nadir having at least one hundred meter resolution; and a transmitter; said transmitter being connected to said focal plane arrays; a parallel to serial bit stream serializer connected to said transmitter; said serial bit stream serializer for converting the parallel inputs from the detection elements within said image processor to a serial bit stream in preparation for transmission.
22. An apparatus comprising: a satellite; said satellite being placed at a stable Earth-Moon Lagrange Point; an image sensor; said image sensor being carried aboard said satellite; said image sensor including a plurality of co-collimated telescopes; said co-collimated telescopes being configured to produce a plurality of images of the same field of view; said image sensor being carried aboard a satellite in geostationary orbit; said image sensor being configured to be pointed generally toward Earth; said image sensor including a focal plane; said focal plane including a plurality of focal plane arrays; said focal plane arrays being configured to produce a plurality of data; said plurality of focal plane arrays being stitched together to form a single cohesive image; said focal plane arrays being sparsely populated upon said focal plane of each of said co-collimated telescopes; a sensor image controller and a sensor image readout; said sensor image controller and said sensor image readout being connected to said plurality of focal plane arrays; said sensor image controller and said sensor image readout for semi-autonomously controlling said image sensor, and for producing fixed resolutions to provide optimized imagery and data, and still allow one frame per second frame rates; said plurality of data including a plurality of images of a generally entire visible hemisphere of the surface of the Earth; said plurality of data providing persistent imaging of said generally entire visible hemisphere of the surface of the Earth; said plurality of images each being captured as full images substantially simultaneously; said plurality of images each having resolutions that correspond with an image at nadir having at least one hundred meter resolution; and a transmitter; said transmitter being connected to said focal plane arrays; a parallel to serial bit stream serializer connected to said transmitter; said serial bit stream serializer for converting the parallel inputs from the detection elements within said image processor to a serial bit stream in preparation for transmission.
Description
A BRIEF DESCRIPTION OF THE DRAWINGS
A DETAILED DESCRIPTION OF PREFERRED & ALTERNATIVE EMBODIMENTS
(42) I. Overview of the Invention
(44) A geostationary satellite appears to remain motionless relative to the Earth because its orbit keeps it over the same region of the Earth's surface. In this way, the relative motions of the Earth and the satellite are nulled. The geostationary orbit is generally contained in the Earth's Equatorial plane, and the revolution rate of the satellite in its orbit is the same as the rotational rate of the Earth.
(45) The geostationary orbit (GSO) is a distinct and specific subset of geosynchronous Earth orbit (GEO). A GEO satellite need only pass over the exact same spot on the Earth's surface once per day, i.e., its orbit is synchronous with the Earth's rotation, while a GSO satellite appears to be continuously stationary in the sky above a specific spot on the Earth's Equator. Accordingly, consistent images may be taken of the portion of the Earth's surface and atmosphere that falls within the hemisphere of the Earth viewable by the GSO satellite, i.e., the observable hemisphere, which is typically referred to as the “footprint of the satellite.” The observable hemisphere is not a full geographic hemisphere, i.e., not fifty percent of the Earth, as the GSO is too close to the Earth to see the extreme polar regions of the Earth, i.e., generally above +81.3 degrees latitude or below −81.3 degrees latitude.
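The ±81.3 degree limit follows from simple geometry: the highest latitude visible from GSO is the geocentric angle at which the line of sight from the satellite grazes the Earth's surface, arccos(R_Earth/r_GSO). A quick check (a sketch only; the constants below are standard values, not figures from the specification):

```python
import math

R_EARTH = 6378.137   # Earth's equatorial radius, km
R_GSO = 42164.0      # geostationary orbit radius from the Earth's center, km

# The limb of the observable hemisphere is where the line of sight from
# the satellite grazes the surface; its geocentric angle from the
# sub-satellite point equals arccos(R_EARTH / R_GSO).
max_lat = math.degrees(math.acos(R_EARTH / R_GSO))
print(f"Maximum visible latitude: {max_lat:.1f} degrees")  # ~81.3
```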
(46) The advantage of placing a satellite with a sensor imaging the observable hemisphere in a geostationary orbit is that it allows the use of a staring sensor. This allows full images of the observable hemisphere to be taken at the same time, avoiding the image distortions inherent in scanning sensors, which accumulate an image over time.
(47) In other embodiments of the invention, the satellite or satellites may be in other orbits, such as Molniya orbits or other inclined elliptical orbits. These alternative orbits allow true polar coverage with four or more satellites (two with apogees north of the Equator and two with apogees south of the Equator), giving true 24/7 coverage of the polar regions not observable by the GSO satellites. Additionally, highly inclined, highly elliptical orbit (HEO) satellites with orbit periods longer than the twelve hour period of a Molniya orbit can be used to give longer persistence per satellite over the polar regions. These HEO satellites allow significantly longer periods of persistence per satellite sensor than Molniya orbit satellites, and their more slowly changing nadir point gives the sensor much less push or smear in the imagery than a Molniya orbit will. Further, satellites can be placed at the two stable Earth-Moon Lagrange Points, typically referred to as L4 and L5, to give even longer persistence and a near full hemispherical observation area. With the L4 and L5 points at the same distance from the Earth as the Moon, sensors on satellites placed at these locations give a perspective that changes slowly as the Earth-Moon orientation changes, something that is not possible from GSO, Molniya, or HEO orbits.
(48) II. A Detailed Description of One Preferred Embodiment of the Invention
(55) The focal plane arrays in each of the four focal planes are sparsely populated, but overlaps 34 result when the images from each of the focal planes 24 are combined to form a single image across all the focal plane arrays 24. For the central focal plane array 24 of a set of nine in this example, shown in this diagram as FPA 24DA, the focal plane array 24 has overlap on all four sides from the neighboring focal plane arrays, which are physically located in the focal planes of the other co-collimated telescopes. The overlap is also triple at the corners: taking the top-right corner of FPA 24DA as the example, the overlaps from the focal plane array above, FPA 24BA, to the right, FPA 24CB, and to the upper-right, FPA 24AB, all coincide with the upper-right corner of FPA 24DA, giving a redundant set of pixels from four different FPAs from four different co-collimated telescopes and four different focal planes with which to align the final imagery from the focal plane arrays 24 that are physically located across multiple co-collimated telescopes.
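The tiling and overlap scheme of paragraph (55) can be sketched numerically. The sketch below assumes, purely for illustration, four co-collimated telescopes A-D whose focal planes carry FPAs at alternating grid positions, with each FPA stepped so that neighbors share a small guard band of redundant pixels; the tile size and overlap width are invented values, not the patented geometry:

```python
TILE = 64        # hypothetical FPA width in pixels
OVERLAP = 4      # guard band of redundant pixels shared at each seam
STEP = TILE - OVERLAP

def plane_for(row, col):
    """Parity of (row, col) selects which of the four co-collimated
    telescopes A-D physically carries the FPA at that grid position
    (hypothetical labeling, after the 24AA..24DA style above)."""
    return "ABCD"[(row % 2) * 2 + (col % 2)]

def mosaic_size(rows, cols):
    """Pixel dimensions of the stitched image: tiles advance by STEP,
    so adjacent tiles share an OVERLAP-pixel band for alignment."""
    return (rows * STEP + OVERLAP, cols * STEP + OVERLAP)

def redundancy(x, y, rows, cols):
    """Number of FPAs (hence telescopes) recording pixel (x, y):
    1 in a tile interior, 2 along a seam, 4 at an interior corner."""
    def covers(i, n):  # tiles along one axis whose span contains i
        return sum(1 for t in range(n) if t * STEP <= i < t * STEP + TILE)
    return covers(x, rows) * covers(y, cols)

print(mosaic_size(3, 3))         # (184, 184)
print(redundancy(60, 60, 3, 3))  # 4: a corner pixel seen by four FPAs
```

The factor-of-four redundancy at interior corners matches the quadruple overlap described for the top-right corner of FPA 24DA.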
(57) The focal plane arrays 24 are read out by the sensor image readout 42, which includes the software controlled readout electronic elements 46. The images and data 44 from the focal plane arrays 24 are transferred from the software controlled readout electronic elements 46 of the sensor image readout 42 to an image processor 48, which includes software controlled processing electronic elements 52. The software controlled readout electronic elements 46 additionally transfer data such as maximum light levels 50 to the software controlled electronics 40 of the sensor image controller 38 to optimize control functions. The software controlled processing electronic elements 52 of the image processor 48 transfer processed images 54 to the transmitter 16.
(59) control integration time per FPA 62;
(60) control integration time segment of FPA 66;
(61) control frame rate of a segment of a FPA 68; and
(62) control frame rate of FPA 64.
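The four control functions above amount to a small per-FPA parameter set with optional per-segment overrides. A minimal sketch follows; the class and field names are illustrative only, not from the specification, while the clamping ranges are taken from claims 5 and 14:

```python
from dataclasses import dataclass, field

# Limits from the claims: integration time of one microsecond to ten
# seconds (claim 14); frame rates of one-tenth to thirty fps (claim 5).
T_MIN, T_MAX = 1e-6, 10.0
FPS_MIN, FPS_MAX = 0.1, 30.0

def clamp(x, lo, hi):
    return min(max(x, lo), hi)

@dataclass
class SegmentControl:
    """Hypothetical settings for one segment (region) of an FPA."""
    integration_time_s: float
    frame_rate_hz: float

@dataclass
class FpaControl:
    """Per-FPA settings mirroring the four control functions above:
    integration time and frame rate for the whole array, plus optional
    per-segment overrides."""
    integration_time_s: float = 0.01
    frame_rate_hz: float = 1.0
    segments: dict = field(default_factory=dict)  # segment id -> SegmentControl

    def set_segment(self, seg_id, integration_time_s, frame_rate_hz):
        # Out-of-range requests are clamped to the claimed limits.
        self.segments[seg_id] = SegmentControl(
            clamp(integration_time_s, T_MIN, T_MAX),
            clamp(frame_rate_hz, FPS_MIN, FPS_MAX))

fpa = FpaControl()
fpa.set_segment("upper-left", integration_time_s=1e-9, frame_rate_hz=60.0)
print(fpa.segments["upper-left"])  # both values clamped into range
```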
(63) The use of field programmable gate arrays (FPGAs) as described here offers multiple advantages over discrete electronic circuits and application specific integrated circuits (ASICs). FPGAs can be updated with new firmware while the sensor and satellite are on orbit, while discrete electronics and ASICs cannot be. This ability allows the operator to update the satellite and sensor to new algorithms and circuit implementations developed after the satellite is launched. FPGAs also allow the sensor and satellite to realize better size, weight, and power requirements than discrete electronic circuits.
(70) The satellite and sensor embodiments described in
(76) The satellite and sensor embodiments described in
(84) The satellite and sensor embodiments described in
(86) The satellite and sensor embodiments described in
(90) The satellite and sensor embodiments described in
(96) Further, in
(98) The Max Brightness Track & Hold electronics 264 stores the values of the brightest pixels within the imagery data 262 and passes them to the Brightness Level Detection electronics 266 within the Image Controller FPGA 60 within the Software Controlled Electronics 40 within the Sensor Image Controller 38. The Brightness Level Detection element 266 passes the brightness levels to the Brightness versus Saturation Level Detection element 268, which compares the brightness of the brightest pixels within the FPA to the maximum, saturated level possible within the FPA, determining whether those pixels are not saturated, saturated, or super saturated. This determination is sent to the Saturation Level Set element 270, which sets the requirements for any changes so that the next frame captured has just barely saturated pixels.
(99) The Saturation Level Set element 270 sends that determination to the Next Frame Integration Time Determination element 272, which determines the integration time duration of the next frame in order to have the brightest pixels in the next frame saturated but not super saturated. The Next Frame Integration Time Determination element 272 sends that determination to the FPA Integration Duration Set element 274, which sends the set data to the FPA 24 via the FPA Integration Duration Set Signal 276. The FPA Integration Duration Set Signal 276 is also sent to the Image Processing Element 52 within the Image Processor 48. Further, within the Image Controller Electronics 40, the Conversion Trigger electronics 204 sends the Conversion Trigger 278 to the FPA, and the Readout Trigger electronics 206 sends the Readout Trigger 280 to both the FPA and the Pixel Readout 260.
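The feedback loop of paragraphs (98) and (99) — track the brightest pixel, compare it against the FPA's saturation level, and choose the next frame's integration time so the brightest pixels are just barely saturated — can be sketched as follows. This is a minimal model only: the 12-bit full scale, the target level, and the back-off policy for saturated frames are assumptions, not the patented implementation; only the one-microsecond-to-ten-second clamp comes from claim 14:

```python
SATURATION = 4095   # full-scale count for a hypothetical 12-bit FPA

def next_integration_time(max_pixel, t_current,
                          t_min=1e-6, t_max=10.0, target=0.98):
    """One step of the exposure loop: scale the integration time so the
    brightest pixel of the next frame lands just below full scale.
    Result clamped to the 1 microsecond..10 second range of claim 14."""
    if max_pixel >= SATURATION:
        # Saturated (possibly super saturated): true brightness is
        # unknown, so back off geometrically rather than scaling.
        t_next = t_current / 2.0
    else:
        # Unsaturated: scale linearly toward the target level.
        t_next = t_current * (target * SATURATION) / max(max_pixel, 1)
    return min(max(t_next, t_min), t_max)

# A bright flash saturates the FPA; the controller halves integration time.
print(next_integration_time(4095, 0.010))              # 0.005
# A dim frame; the controller lengthens integration toward full scale.
print(round(next_integration_time(1000, 0.010), 4))    # 0.0401
```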
(101) As further shown in
(102) Further, as shown
(103) An imaging satellite with a sensor control and data processing system configured to semi-autonomously control a sensor and produce variable resolutions: the system allows image-frame-by-image-frame updating of the sensor to provide optimized imagery and data while still allowing one frame per second frame rates, and permits combining pixels or sets of pixels into different resolutions before transmission to the ground. Such variance of resolutions can lower the compression rate needed in data-rate-limited transmissions.
(104) An imaging satellite with a sensor control and data processing system configured to semi-autonomously control a sensor and produce fixed frame rates, which allows the sensor to optimally image an extended area of the Earth, as opposed to prior art sensors with permanently fixed or narrowly pre-defined frame rates or scanning rates. An example is to image a daylight area at one fixed rate and a night area at a different fixed rate, allowing optimal imagery for each.
(105) An imaging satellite with a sensor control and data processing system configured to semi-autonomously control a sensor and produce variable frame rates, which allows the sensor to optimally image an area of the Earth as lighting and environmental conditions change, as opposed to prior art sensors with permanently fixed frame rates or scanning rates, which cannot rapidly adapt to changing environments. An example is to image a daylight area and optimally change the frame rate as the area goes from early morning with moderately low light to high noon with bright lighting to evening with moderately low light.
(106) A sensor control and data processing system that monitors the saturation and pixel received energy levels of a sensor's focal plane array (FPA) and modifies the FPA's relevant operational parameters to maximize the signal to noise ratio of the imagery generated, as opposed to merely moderating the integration time of the FPA. One example is that for some FPAs the electronic bias can be changed to optimize the range of light between true darkness, i.e., no light on a given pixel, and maximum brightness, i.e., so much light on a given pixel that it is super saturated and records only a fraction of that amount of light. Dynamically controlling that range in an FPA can produce optimal imagery.
(107) A sensor control system that monitors a predefined subset of the pixels of each FPA may capture predefined signatures or may be used to capture predefined motions.
(108) A sensor control system that monitors all of the pixels of each FPA may be used to control the frame rates or bias of an FPA for the next frame.
(109) A sensor control and data processing system that produces imagery at full resolution of the field of view of an additional image sensor; the sensor described herein may be used in coordination with another sensor onboard each satellite, with the other sensor observing a subset of the area observed by the sensor described herein at a different frame rate or resolution.
(110) A data processing system combining imagery of full and lesser resolution images with the imagery from a narrow field image to provide data fused images, allowing the imagery and the data from the two sensors to be combined to produce unique information that is not available from either sensor alone.
(111) The present invention uses a novel combination of elements to deliver full hemisphere images to the end user in thirty seconds or less.
(112) The present invention uses a “fractionated sensor” to take the hemispherical images all at once. According to the present invention, the minimum time to take, record, and read out a full observable hemispherical image is about 1/30 of a second. The present invention accomplishes this task by employing a novel combination of multiple co-collimated telescopes, focal planes, and sparsely populated focal plane arrays as described above in the Detailed Description Section.
(113) The present invention does not “build up” or accumulate a full hemisphere image by using a scanning sensor. The image produced by the present invention is assembled simultaneously with multiple imaging two-dimensional focal plane arrays, all at the same time. The present invention uses directly illuminated focal planes, with one focal plane per telescope and multiple sparsely populated focal plane arrays for each focal plane. The focal plane arrays are sparsely populated on each individual focal plane, but they overlap when the focal plane arrays from the multiple focal planes are combined to form a single image from all the co-collimated telescopes, focal planes, and focal plane arrays.
(114) According to one embodiment of the present invention, the entire process, which begins when an event occurs, to the delivery to a user requires less than 30 seconds.
(115) One embodiment of the present invention may be described as a series of events, as recited in Table One:
(116) Table One
(117) Event happens
(118) Light travels from the event to the satellite sensor telescopes
(119) Light is focused on the focal plane of each telescope
(120) Light is captured by the focal plane arrays in the focal plane of each telescope and read out as voltages
(121) Voltages are processed on the satellite for transmission
(122) Imagery and data are transmitted to the ground
(123) The ground station receives the data stream and “unpacks” the data
(124) The ground station sends the data to the ground processing site
(125) The ground processing site takes the raw imagery and data and turns it into composed, user observable images
(126) In parallel to the creation of composed, user observable images, the ground processing site extracts selected data from the images
(127) The composed images are transferred to storage (for future use) and to the distribution system
(128) The distribution system (Internet, other communications satellites, etc.) delivers the live images and related, extracted data to the user
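As a rough plausibility check of the under-thirty-second delivery claim of paragraph (114), the Table One chain can be given a latency budget. Only the ~1/30 second capture floor of paragraph (112) and the one-way light delay from geostationary altitude come from the source; every other stage time below is an illustrative assumption:

```python
C_KM_S = 299_792.458          # speed of light, km/s
GSO_ALTITUDE_KM = 35_786.0    # geostationary altitude above the surface

# Table One stages with hypothetical per-stage latencies in seconds.
stages = [
    ("light: event to satellite",   GSO_ALTITUDE_KM / C_KM_S),  # ~0.12 s
    ("capture + readout",           1 / 30),    # floor from paragraph (112)
    ("onboard processing",          2.0),       # assumed
    ("downlink transmission",       GSO_ALTITUDE_KM / C_KM_S + 3.0),  # assumed
    ("ground station unpack",       2.0),       # assumed
    ("transfer to processing site", 1.0),       # assumed
    ("compose user images",         5.0),       # assumed
    ("distribution to user",        3.0),       # assumed
]

total = sum(t for _, t in stages)
for name, t in stages:
    print(f"{name:28s} {t:6.3f} s")
print(f"{'total':28s} {total:6.3f} s")
assert total < 30.0  # consistent with the under-30-second claim
```

Even with several seconds of assumed processing at each hop, the budget closes well under thirty seconds; the physical light-travel terms are negligible at GSO distances.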
(129) III. Alternative Embodiments of the Invention
(130) The present invention may be implemented in a wide variety of embodiments. Alternative embodiments include, but are not limited to:
(131) An imaging satellite with an image sensor including a focal plane array consisting of a charge coupled device (CCD) array, as a CCD array allows sensitivity not afforded by other arrays and in many cases has greater radiation tolerance than other arrays, which can be important in GSO because GSO satellites and sensors are in a higher radiation environment than LEO satellites.
(132) An imaging satellite with an image sensor including a focal plane array consisting of a complementary metal-oxide-semiconductor (CMOS) array, as a CMOS array has a wider family of off the shelf supporting electronics, giving greater flexibility and potentially greater capability than other arrays.
(133) An imaging satellite with an image sensor including a focal plane array consisting of a scientific CMOS (SCMOS) array, as SCMOS has the potential to allow greater responsiveness, and thus the ability to do better low light imaging, than other arrays.
(134) An imaging satellite with an image sensor including a focal plane array consisting of a micro-bolometer (μ-bolometer) array, as a μ-bolometer allows imaging in mid-wave through long wave infrared (typically 4 μm through 20 μm) without the use of cryo-coolers.
(135) An imaging sensor with a focal plane array using a stacked sensor such that it is capable of recording color in each individual pixel, e.g., a FOVEON array, which allows each pixel to provide multiple colors of information, which in turn allows for denser spectral information within each focal plane array.
(136) An imaging sensor with a focal plane array using a 2×2, 2×3, 2×4, 3×3, 3×4, or 4×4 array of pixels with different filters to allow for the creation of full color imagery, as such variants allow the spectral variations to include the classic Bayer array and to go beyond it to give additional, spectrally relevant information.
(137) An imaging sensor with a primary image sensor comprised of an optical train illuminating a focal plane array, allowing the full spectrum of the received light to irradiate a full set of focal plane arrays.
(138) An imaging sensor with a primary image sensor comprised of each telescope encompassing a single optical train directly illuminating a set of optical beam splitters, such that each split is for a different frequency/wavelength and each split frequency illuminates a focal plane array, which in turn allows a full focal plane array for each frequency of light.
(139) An imaging sensor comprised of a single optical train per telescope directly illuminating a diffraction grating such that each different frequency illuminates a focal plane array, which allows for a more compact version than beam splitters but with the limitation of narrow bands tuned to the grating.
(140) An imaging sensor observing and recording imagery and data of a cooperative target in real-time which can allow the sensor in GSO to persistently stare at the cooperative target as the sensor is accurately characterized.
(141) An imaging sensor in GSO persistently observing and recording imagery and data of a target on the Earth to characterize that target.
(142) An imaging sensor in GSO persistently observing and recording imagery and data of a target in the Earth's atmosphere to characterize that target.
(143) An imaging sensor in GSO persistently observing and recording imagery and data of a target on the Earth to characterize the sensor.
(144) An imaging sensor in GSO persistently observing and recording imagery and data of a target in the Earth's atmosphere to characterize the sensor.
(145) An imaging sensor wherein a cooperative target is data linked to the satellite's primary ground station, which allows the cooperative target to change its characteristics as the persistent sensor is characterized.
(146) An imaging sensor wherein the cooperative target is data linked to the satellite with the sensor, which will allow for machine to machine coordination of what is being observed and what the observation records.
(147) An imaging sensor observing and recording imagery and data of an object in the Earth's orbit, which will allow determination of characteristics of that object.
(148) An imaging sensor observing and recording imagery and data of an object in space outside the Earth's orbit, which will allow the observation and tracking of objects other than those in the Earth's orbit. An example of observations and recordings of such objects is the tracking of Near Earth Objects (NEOs), which are asteroids or comets that have sun-centric orbits that may intersect, or nearly intersect, the orbit of the Earth.
(149) A satellite has an additional image sensor, as adding an additional sensor can allow two different resolutions that can be used in conjunction to give additional imagery and data information. An example of such a satellite with an additional image sensor is a satellite that has a sensor that observes and records the full observable hemisphere and an additional sensor that provides a much narrower field of view but at a significantly higher resolution.
(150) An imaging satellite with a pointing system configured to change a position of a sensor with a different field of view with regard to the surface of the Earth so that the image sensor perceives different portions of the Earth's surface when producing data or a series of images.
(151) An imaging satellite with a pointing system that includes a gimbaled set wherein an optical telescope of a narrow field of view image sensor is pointed by adjusting the angle of the telescope relative to the body of the satellite.
(152) An imaging satellite with a pointing system that includes a control mechanism configured to control an amount of spin imparted by a momentum or reaction wheel on the satellite so as to impart a relative rotation of the satellite with respect to the Earth and cause an optical path of the image sensor to change with respect to a predetermined spot on the Earth.
(153) An imaging satellite with a sensor control and data processing system that can be programmed at any time, through communications links from the ground control systems, to produce images of fixed resolutions and frame rates, which allows the sensor to be commanded to produce imagery and related data at the native resolution of the FPAs and at frame rates that can be compared to baseline data sets.
(154) An imaging satellite with a sensor control and data processing system that can be programmed at any time, through communications links from the ground control systems, to produce images of variable resolutions and frame rates, as variable resolutions and frame rates allow the received light to be combined over multiple pixels and thus provide improved low light capabilities. An example of this use of variable resolution would be in night conditions with partial or no moonlight.
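A minimal sketch of the combining-light idea, assuming simple k×k pixel binning of a 2D frame held as plain Python lists (frame dimensions are assumed divisible by k):

```python
def bin_pixels(frame, k):
    """Sum each k x k block of pixel values into one output pixel,
    trading spatial resolution for collected light in low-light scenes."""
    rows, cols = len(frame), len(frame[0])
    return [[sum(frame[r + dr][c + dc]
                 for dr in range(k) for dc in range(k))
             for c in range(0, cols, k)]
            for r in range(0, rows, k)]
```

Binning by 2 quadruples the light collected per output pixel at the cost of halving resolution in each axis, which is the trade the variable-resolution night mode exploits.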
(155) A sensor control and data processing system with ground control systems monitoring the saturation and pixel received energy levels of the FPAs and modifying the FPAs' relevant operational parameters, allowing the ground systems to override the semi-autonomous system onboard the satellite and associated with the sensor. Such a ground override can force the system to produce non-optimized imagery for specialized purposes like characterizing degradations over the lifetime of the sensor.
(156) A sensor control and data processing system commanded from the ground control systems to produce imagery at various resolutions to minimize the amount of controlling electronics on the satellite associated with the sensor. Such a configuration can minimize satellite costs and launch mass, thereby minimizing launch costs.
(157) A processing system that, as commanded from the ground control systems, produces imagery at full resolution for a select area of the observable hemisphere and reduced resolution imagery for the rest of the observable hemisphere can optimize the information transmitted to the ground. For example, full resolution of some areas of the Earth and lower resolution of other areas can allow for lower compression ratios; as high compression ratios are not lossless, the amount of information lost can then be minimized.
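One possible sketch of such a mixed-resolution downlink, assuming a rectangular region of interest kept at full resolution while the whole frame is also block-averaged by a factor k (the coordinate convention and names are illustrative assumptions):

```python
def mixed_resolution(frame, roi, k):
    """Return (full-resolution crop of roi, k-times block-averaged frame).
    roi = (r0, r1, c0, c1) in pixel coordinates, end-exclusive; frame
    dimensions are assumed divisible by k."""
    r0, r1, c0, c1 = roi
    crop = [row[c0:c1] for row in frame[r0:r1]]          # full-resolution area
    low = [[sum(frame[r + dr][c + dc]                    # averaged context
                for dr in range(k) for dc in range(k)) / (k * k)
            for c in range(0, len(frame[0]), k)]
           for r in range(0, len(frame), k)]
    return crop, low
```

Only the crop and the reduced frame are transmitted, so the select area keeps its fidelity while the remainder costs far fewer bits.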
(158) A data processing system wherein the imagery of full and lesser resolution images is processed at different frame rates to optimize the imagery for transmission to the ground allowing another means other than just variable resolution to optimize the information sent to the ground.
(159) A data processing system wherein the lesser resolution imagery and full resolution imagery is selectively stored for future recall allows areas of little or no change to be stored at lower resolution to minimize storage and data transport costs.
(160) A data processing system wherein the imagery of full and lesser resolution images may be combined with the imagery from a separate sensor to provide data fused images that can provide information not derivable from either sensor alone.
(161) An imaging satellite with a sensor control and data processing system that includes a data compression mechanism configured to compress the data before transmitting the data to a ground location. The data stream from the sensor described herein is huge. Compressing the imagery and data allows conventional RF or laser communications to transmit the compressed imagery and data to the ground station.
(162) A data processing system performing lossless compression, resulting in no loss of information. Some uses of the imagery and data, such as measurement intelligence techniques, require the utmost available fidelity. Lossless compression supports such uses.
(163) A data processing system performing variable bit rate compression, which can allow compression to vary as circumstances change over time. With a finite data bandwidth for transmissions to the ground station, as the frame rates change within the sensor, variable bit rate compression allows the electronics to change compression rates and still meet the finite data transmission bandwidth while compressing the imagery and data as little as practical.
(164) A data processing system performing lossy compression in which the loss preferentially falls in the lower order (least significant) bits of each pixel in an image, as lower order bits impart less inaccuracy into the compressed imagery and data to be transmitted to the ground.
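As a sketch of loss confined to the least-significant bits, assuming unsigned integer pixel values:

```python
def drop_low_bits(pixel, n):
    """Zero the n least-significant bits of an unsigned pixel value, so
    any error introduced before compression is at most 2**n - 1 counts."""
    return pixel & ~((1 << n) - 1)
```

Quantizing away the low bits makes the pixel stream more repetitive, so a subsequent lossless coder compresses it further while the visible error stays bounded.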
(165) A data processing system of performing an industry standard motion imagery lossy compression algorithm as using industry standard compression can lower costs as well as take advantage of progression within the field of the mathematics of compression algorithms.
(166) An imaging satellite with a transmitter configured to transmit the data directly to a ground station, which will allow imagery and related data to be transmitted continuously with absolute minimum latency to imagery and data processing systems.
(167) An imaging satellite with a transmitter configured to transmit the data directly to a remote user location, which will allow transmission to a remote location where the information is most useful. An example is transmission into a military facility within an active engagement zone, allowing immediate access to the imagery and related data without going through the main ground station, processing, and distribution system.
(168) An imaging satellite with a transmitter configured to transmit the data directly to another imaging satellite to relay the imagery and data to a remote location, which allows transmission to ground stations not within line of sight of the satellite with the sensor on it. Such a combination of a primary satellite with the sensor and transmissions through a secondary satellite can provide the imagery through downlinks directly to any place on the Earth.
(169) An imaging satellite with a transmitter configured to transmit the data directly to a network node configured to relay said imagery and data to a remote location by way of the Internet, which allows the ground station to be placed at an environmentally optimal site while allowing data processing and distribution to be located elsewhere.
(170) An imaging satellite wherein the imaging satellite is one satellite of a constellation of at least three similar satellites in GSO which allows the full system of three satellites to cover the entire Earth except for the polar regions of the Earth.
(171) An imaging satellite constellation in which the constellation supports imaging the same point on the Earth from more than one satellite of the constellation, such as with a constellation of three or more equally spaced GSO satellites in which the sensors observing the full observable hemisphere from GSO have overlapping fields of view.
(172) An imaging satellite system with a ground processing system configured to create images and motion imagery from the imagery and data in real-time allows the imagery and data to provide useful information to the end users as the events being imaged happen.
(173) A ground processing system that performs data cumulation processing on the imagery, i.e., the process of combining the imagery and data to give higher dynamic range, to produce imagery of a quality better than the raw imagery.
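A minimal sketch of data cumulation, assuming N co-registered frames averaged pixel-by-pixel; for uncorrelated noise this improves signal-to-noise by roughly the square root of N:

```python
def cumulate(frames):
    """Average co-registered frames (lists of lists) pixel-by-pixel to
    raise effective dynamic range and signal-to-noise."""
    n = len(frames)
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[sum(f[r][c] for f in frames) / n for c in range(cols)]
            for r in range(rows)]
```

A real pipeline would register the frames and handle saturated pixels first; this only illustrates the accumulation step itself.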
(174) A ground processing system that performs super resolution processing techniques, which can produce imagery of resolutions better than the raw resolution of the sensor.
(175) A ground processing system that derives trends or patterns of interest from the imagery and data and can then compare the trend or pattern to a trend or pattern stored in a database. For example, this can allow notification to interested parties of changes in patterns.
(176) A ground processing system that combines imagery from more than one satellite of a constellation, allowing the imagery from multiple satellites observing the same area on the Earth to provide different views of the same area, resulting in information being derivable that cannot be derived from any individual satellite.
(177) A ground data processing system combining imagery from more than one satellite to produce higher resolution imagery than is transmitted down from any one satellite; because the pixels of the imagery of each satellite do not precisely coincide, the offset of pixels can be utilized to produce mathematical subsets of each pixel and thus effectively higher resolution imagery.
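The pixel-offset principle can be sketched in one dimension; this ignores registration and resampling, which a real super-resolution system would need, and simply assumes two sensors sampling the same scene with a half-pixel offset:

```python
def interleave_offset(samples_a, samples_b):
    """Interleave two sample streams taken of the same scene with a
    half-pixel offset, yielding a grid at twice the sampling density."""
    out = []
    for a, b in zip(samples_a, samples_b):
        out.extend([a, b])
    return out
```

The combined stream carries spatial frequencies neither sensor alone resolved, which is the basis for the effective higher resolution claimed above.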
(178) A ground data processing system combining imagery from more than one satellite to produce stereoscopic imagery, which can provide a form of 3D processed imagery of the combined field of view.
(179) An imaging satellite system with a ground processing system configured to extract signatures of interest from the imagery, which can afford a wide array of extractions that are more expansive and changeable than doing such signature extraction on the satellite.
(180) A ground processing system combines imagery from external sources to extract events of interest which allows access to other sources for fusion of the imagery and data to provide useful information not available from any single source.
(181) An imaging satellite system comprising a high resolution display in a ground facility that is a surface of a sphere to give a real-time representation of the Earth as events unfold.
(182) A high resolution display system that is a plurality of projectors located at or near the center of the sphere to project onto the interior of the sphere, which will allow the observers to move about the sphere without blocking any projectors that would otherwise be outside the sphere projecting onto its exterior surface.
(183) A high resolution display system that includes a touch interface on the surface of the sphere to allow human interaction and direction of the way the imagery and data is displayed which will allow users to select menus, display extracted data or expand resolution of selected areas on the sphere.
(184) An imaging sensor as described in the preferred embodiment that is body mounted to the satellite bus and pointed by means of repointing the satellite bus allowing stable repointing of the sensor.
(185) An imaging sensor as described in the preferred embodiment that is mounted to the satellite and pointed by means of a two axis tilt plane that is mounted between the sensor and the satellite bus allowing faster repointing than a satellite body mounted sensor.
(186) An imaging sensor as described in the preferred embodiment that is mounted to the satellite and pointed by means of a two axis tilt plane that is mounted between the sensor and the satellite bus which repoints the body of the sensor itself.
(187) An imaging sensor observing and recording imagery and data of an uncooperative target on the Earth, in the Earth's atmosphere, or in space, wherein the sensor provides independent imagery and data of that target.
(188) An imaging sensor wherein the optics include a non-imaging, total internal reflection mirror for folding the optical train to make the telescope more compact. For example, volumes on satellites are often very constrained, and the optical train of large optics with wide fields of view can be very long, longer than the volume allowed for the satellite. Using a mirror to fold the optical train allows it to be more compact.
(189) An imaging sensor with a rotating filter wheel with one or more color sections within the wheel between the telescope and the focal plane array to allow for the creation of a broad range of full color imagery while allowing the full focal plane array to record the same color.
(190) The Readout Electronics of the imaging satellite that read out the imagery and data in parallel from each focal plane array by means of one or more readout ports on each focal plane array allow the full image across multiple focal plane arrays to provide a single, consistent picture.
(191) A sensor control and data processing system that varies the frame rates of sub elements of a FPA to allow for optimal, concurrent high light imaging and low light imaging of objects and events within the field of view of the FPA under both high light and low light conditions.
(192) A sensor control and data processing system within the imaging satellite that synchronizes frame rates across multiple FPAs allows a consistent image as regular intervals to be constructed across the entire field of view of the sensor.
(193) A sensor control and data processing system within the imaging satellite that synchronizes multiple frame rates that are integer multiples of each other across multiple FPAs, allowing analyses and object tracking to be done across multiple frames at synchronized intervals.
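The alignment property of integer-multiple frame rates can be sketched as follows: frames from two FPAs with integer rates (in Hz) coincide when i/rate_a equals j/rate_b, so the smallest coinciding frame counts are the rates divided by their greatest common divisor:

```python
from math import gcd

def sync_period_frames(rate_a_hz, rate_b_hz):
    """Smallest frame counts (i, j) after which FPA A's frame i and
    FPA B's frame j are captured at the same instant."""
    g = gcd(rate_a_hz, rate_b_hz)
    return rate_a_hz // g, rate_b_hz // g
```

With integer-multiple rates such as 30 Hz and 10 Hz the result is (3, 1): every frame of the slower FPA coincides with a frame of the faster one, which is what makes cross-FPA tracking at synchronized intervals straightforward.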
(194) An image and data processing system within the satellite that characterizes levels of contrast within an FPA allowing signatures of items within the field of view of that FPA to be recorded and characterized.
(195) An image and data processing system within the satellite that characterizes levels of contrast across multiple FPAs allowing signatures of items within the field of view of FPAs to be recorded and characterized.
(196) An image and data processing system within the satellite that creates an alert that may be transmitted to the ground when contrast level differences are of pre-defined levels, allowing the characterization of objects within the sensor field of view to be attained on the satellite and then transmitted to users on the ground.
(197) An image and data processing system within the satellite that creates an alert that may be transmitted to the ground when contrast level differences are of pre-defined levels and those levels move from one FPA sub element to another, allowing the characterization of the motion of objects within an FPA's field of view to be attained on the satellite and then transmitted to users on the ground.
(198) An image and data processing system within the satellite that creates an alert that may be transmitted to the ground when contrast level differences are of pre-defined levels and those levels move from one FPA to another FPA, allowing the characterization of the motion of objects across multiple FPAs' fields of view to be attained on the satellite and then transmitted to users on the ground.
(199) An image and data processing system within the satellite that creates an alert that may be transmitted to the ground when areas of contrast level differences are of a pre-defined shape, allowing signatures of items based upon shape definition to be recorded and characterized.
(200) An image and data processing system within the satellite that characterizes levels of colors within an FPA allowing signatures of items based upon color definitions to be recorded and characterized.
(201) An image and data processing system within the satellite that characterizes levels of colors across multiple FPAs, allowing signatures of items based upon color definitions to be recorded and characterized for objects of interest that are observed and recorded across multiple FPAs.
(202) An image and data processing system within the satellite that creates an alert that may be transmitted to the ground when color level differences are of pre-defined levels, allowing the characterization of objects within the sensor field of view to be attained on the satellite and then transmitted to users on the ground as alerts.
(203) An image and data processing system within the satellite that creates an alert that may be transmitted to the ground when areas of color level differences are of a pre-defined shape, allowing the characterization of objects due to color differentiation within the sensor field of view to be attained on the satellite and then transmitted to users on the ground as alerts.
(204) An image and data processing system within the satellite that creates an alert that may be transmitted to the ground when color level differences are of pre defined levels and those levels move from one FPA sub element to another allowing the characterization of objects within the sensor field of view to be attained and motion characterized due to color characteristics on the satellite then transmitted to users on the ground as alerts.
(205) An image and data processing system within the satellite that creates an alert that may be transmitted to the ground when color level differences are of pre defined levels and those levels move from one FPA to another FPA allowing alerts to be transmitted to users on the ground based upon tracking of objects based upon coloration.
(206) An imaging satellite with multiple transmitters wherein the transmitters transmit at different data rates and with different encoding schemes to different ground stations, allowing multiple, diverse users with different requirements to obtain tailored information.
(207) An alternate embodiment of the present invention may be described in the following table.
(208) Table Two
(209) Event happens
(210) Light travels from the event to the satellite sensor co-collimated telescopes
(211) Light is focused on the focal plane of each telescope
(212) Light is captured by the series of focal plane arrays in the focal plane of each telescope and read out as voltages
(213) Voltages are processed resulting in the control electronics recording multiple super saturated pixels within one or more focal plane arrays
(214) Voltages are processed as digital bits on the satellite for transmission
(215) Imagery and data are transmitted to the ground
(216) Second, time spaced event happens
(217) Focal plane array control electronics shortens the light integration time of the focal plane arrays with super saturated pixels for the second image frame capture
(218) Light from the second event travels from the event to the satellite sensor co-collimated telescopes
(219) Light from the second event is focused on the focal plane of each telescope
(220) Light from the second event is captured by the series of focal plane arrays in the focal plane of each telescope and read out as voltages
(221) The ground station receives the data stream related to the first event and “unpacks” the data
(222) Voltages from the capture of the second image are processed on the satellite for transmission
(223) The second image imagery and data are transmitted to the ground
(224) The ground station sends the imagery and data of the first image to the ground processing site
(225) The ground processing site takes the raw imagery and data from the first event and turns it into composed, user observable images
(226) The ground station sends the imagery and data of the second event to the ground processing site
(227) In parallel to the creation of composed, user observable images, the ground processing site extracts selected data from the first image
(228) In parallel to the creation of composed, user observable images, the ground processing site extracts selected data from the second image
(229) The composed images are transferred to storage (for future use) and the distribution system
(230) The distribution system (Internet, other communications satellites, etc.) delivers the live images and related, extracted data to the user
(231) Electronics controlling the processing of the imagery and associated data from the sensor may perform extractions of information derived from the imagery and data. The electronics may extract such information as:
(232) Position based change detection, which is allowed due to the stationary position of the sensor in GSO or the slowly moving sensor of the Molniya, HEO, or Lagrange orbits.
(233) Motion based change detection which is specifically supported by the ability to do persistent imaging of each of these orbits.
(234) Activities based change detection, which is afforded by the combination of both motion based change detection and position based change detection over time, supported by the persistent sensors.
(235) Behavior based change detection which is a specific step combining and comparing activities based change detection with external data accumulated by means of persistent imaging afforded by a staring sensor in GSO, Molniya, HEO or Lagrange orbit.
(236) Location based signatures extraction is a step of obtaining a specific set of colors or patterns that matches a pre-defined set of criteria of interest, which is made possible by persistent imaging afforded by a staring sensor in GSO, Molniya, HEO, or Lagrange orbit, and
(237) Combined signatures extraction and change detection is the combination of the two when a specific signature is obtained or when a specific change is detected, which is made possible by persistent imaging afforded by a staring sensor in GSO, Molniya, HEO or Lagrange orbit.
(238) All of these detections are directly supported by the ability to do long duration persistent imaging afforded by GSO, Molniya, HEO, and Lagrange orbits.
(239) V. Glossary
(240) 2D Cylindrical Projection
(241) A two dimensional (2D) Cylindrical Projection of the Earth is an equirectangular projection, a very common map projection in which the elements of the Earth's surface are depicted as if drawn onto a cylinder whose main axis coincides with the Earth's axis of rotation. This results in a two dimensional rectangular map in which the meridians and parallels meet at right angles, with the polar regions stretched relative to the equatorial regions.
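The linear mapping behind this projection can be sketched as follows, assuming longitude in [-180, 180] and latitude in [-90, 90] degrees, with y increasing downward from the north pole:

```python
def equirectangular(lat_deg, lon_deg, width_px, height_px):
    """Map latitude/longitude to (x, y) pixel coordinates: longitude maps
    linearly to x and latitude linearly to y, which is why the polar
    regions appear stretched relative to the equator."""
    x = (lon_deg + 180.0) / 360.0 * width_px
    y = (90.0 - lat_deg) / 180.0 * height_px
    return x, y
```

Because every degree of longitude gets the same pixel width regardless of latitude, features near the poles occupy far more map area than their true extent.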
(242) Apogee
(243) The point in a satellite's orbit when the satellite is farthest from the surface of the Earth.
(244) ASIC, Application Specific Integrated Circuit
(245) An integrated circuit (IC) designed, customized, and implemented for a particular use, rather than a generic designed IC intended for general-purpose use.
(246) Bit Stream Compressor
(247) A system by which a serial stream of bits can be reduced in the number of bits.
(248) Bit Stream Interleaver
(249) A system by which a serial stream of bits is re-ordered from the original order.
(250) Bit Transitions
(251) The change in bits, i.e., the changes in voltage or current levels to signify individual bits within a string of bits.
(252) Block Up Converter
(253) A system by which an input signal's frequency is changed to a higher frequency while maintaining the frequency range of the original signal.
(254) Brightness Track and Hold Electronics
(255) A system that accepts a brightness level and stores that in an electronic element.
(256) CCD, Charge Coupled Device Array
(257) An image sensor implementing the movement of charge resulting from light falling onto the sensor's surface to another area of the device for readout.
(258) CMOS Array, Complementary Metal-Oxide-Semiconductor Array
(259) An image sensor created using the same technologies as used to create typical integrated circuit (IC) chips.
(260) Co-collimated Telescopes
(261) A set of two or more telescopes wherein all telescopes in the set are continuously pointed at, and focused on, the exact same region or object.
(262) Collimated
(263) Elements that are perfectly parallel.
(264) Continuous Imaging
(265) Continuous Imaging is the ability to create images of a given area, region, or object over very long periods of days, weeks, or months, with no more than 30 second gaps between successive images during that entire period.
(266) Control Frame Rate
(267) The ability to modify and regulate the Frame Rate of a sensor.
(268) Control Integration Time
(269) The ability to modify and regulate the period of radiation capture by a sensor.
(270) Controlled Readout Electronics
(271) The system by which the retrieval of information from a sensor is modified and regulated.
(272) Controlling Sub Elements
(273) A process for modifying and regulating the processes within a part of a sensor.
(274) Conversion Trigger Electronics
(275) A system to produce the electronic notice for a sensor to change captured energy to electrical values that can be read out.
(276) Cooperative Target
(277) Any object over which the operators of the satellite and sensor can have direct or indirect control.
(278) Data Cumulation
(279) A method of combining the data from multiple images to enhance the signal to noise ratio in the resulting image.
(280) DVB-S2X, Digital Video Broadcasting Satellite, Second Generation Extended
(281) An industry standard Radio Frequency (RF) waveform modulation that implements spectrally efficient, very high data and information rate communications.
(282) Earth Observation
(283) Earth Observation is the viewing and recording of imagery and related data of the Earth and the general space about the Earth.
(284) Electronically Controllable High Refractive Index Material
(285) A material that changes its index of refraction, and by this the path of light going through it, based upon the electrical voltage levels placed across the material.
(286) Encryptor
(287) A system in which bits are reordered in a pre defined pattern in order for the output bits to be unintelligible to any entity reviewing the output bits that does not know the pre defined pattern.
(288) Error Correction Bits
(289) Additional bits added to a serial stream that allows the original stream to be recovered in case of errors in the transmission of the bit stream.
(290) Field of View
(291) The field of view (FOV) of a satellite or sensor is the area able to be observed by the satellite or sensor without the satellite being reoriented or repointed.
(292) FPGA, Field Programmable Gate Array
(293) An integrated circuit designed to be configured by a customer or a designer after manufacturing—hence “field-programmable”. The FPGA configuration is generally specified using a hardware description language (HDL). FPGAs can have their “programming” changed even after they are in an operational system such as a satellite already on orbit.
(294) Focal Plane
(295) The surface onto which an optical system, e.g., a telescope, focuses an image. A Focal Plane may be an imaginary surface in three dimensional space or the surface may be composed of one or more FPAs.
(296) Focal Plane Array (FPA)
(297) An array of elements, typically but not exclusively within a single integrated circuit chip, designed to receive light and convert that light into electrical charges that can then be read out.
(298) Fractionated Sensor
(299) A fractionated sensor is one in which the focal plane arrays (FPAs) are dispersed across multiple optical elements and multiple focal planes while observing and recording a single field of view through every optical element. The optical elements may be optical telescopes covering the wavelengths of ultraviolet through visible through infrared.
(300) Frame Rate
(301) The number of images (frames) recorded and read out by the image sensor per second. For example: the standard frame rate in the U.S. for “full HDTV” is approximately 30 frames per second.
(302) Geostationary Orbit
(303) The orbit in the same plane as the Earth's equator at which an object in this orbit revolves around the Earth at the exact same rate as the Earth rotates on its axis. This is approximately 35,786 km above the Earth's surface at the equator. A Geostationary Orbit is a special case of a geosynchronous orbit.
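The quoted altitude follows from Kepler's third law; as a quick check, using standard values for Earth's gravitational parameter and the sidereal day:

```python
from math import pi

def geostationary_altitude_km():
    """Solve mu * T**2 / (4 * pi**2) = r**3 for the orbit radius r, then
    subtract Earth's equatorial radius, giving roughly 35,786 km."""
    mu = 398600.4418      # km^3/s^2, Earth's gravitational parameter
    t = 86164.0905        # seconds in one sidereal day
    r = (mu * t * t / (4.0 * pi * pi)) ** (1.0 / 3.0)
    return r - 6378.137   # km, Earth's equatorial radius
```

Note the period used is the sidereal day (about 23 h 56 min), not 24 hours, since the orbit must match the Earth's rotation relative to the stars.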
(304) Geostationary Orbit Plane
(305) The Geostationary Orbit Plane is the plane that passes through the center of the Earth and extends through the equator of the Earth and out into space.
(306) Geostationary Observable Hemisphere
(307) The Geostationary Observable Hemisphere is the Satellite Observable Footprint that is able to be viewed from a satellite in geostationary orbit.
(308) Geosynchronous Orbit (GEO)
(309) The orbit in which an object, such as a satellite, takes 24 hours to revolve around the Earth. This orbit has an object, e.g., a satellite, pass over the same point on the surface of the Earth at the same time every day.
(310) Highly Elliptical Orbit (HEO)
(311) A highly elliptical orbit is one in which the eccentricity of the orbit is greater than 0.5.
(312) High Index of Refraction Material
(313) A high index of refraction material is one in which the refractive index is equal to or greater than 1.3.
(314) Highly Refractive Index Plate
(315) A plate comprised of a High Index of Refraction Material
(316) Imagery
(317) The picture represented by the array of pixels created by a sensor.
(318) Inclined Orbit
(319) An Inclined Orbit is one in which the plane of the orbit is not coincident with the Earth's equatorial plane.
(320) Individual Port Readouts
(321) The readout of bits from one of the readout points of a sensor.
(322) Instantaneous Field of View
(323) The instantaneous field of view (IFOV) is the section of space a satellite or sensor observes at a single moment in time. For scanning sensors the IFOV is moved to slowly cover the full field of view (FOV) of the satellite or sensor. For staring sensors or satellites with staring sensors like the satellite and sensor system in this patent submission, the IFOV and the FOV are the same.
(324) Integration Time
(325) The duration in which a sensor captures radiation, typically light or infrared radiation.
(326) Interleaved
(327) A re-ordered set of bits as different from the original ordering of the bits.
(328) Lagrange Point
(329) A Lagrange Point is one of the five points defined by gravitational equilibrium points for a two body system in which one body is significantly larger than the other and the smaller body is orbiting the larger body, e.g., the Earth and the Moon. The L1, L2, and L3 points are unstable gravitational points wherein if an object moves a small distance from any of those points it will continue to move away from that point. The L4 and L5 points are moderately stable points wherein if an object moves a small distance from that point it will return to the vicinity of the L4 or L5 point.
(330) Last Frame Data Element
(331) A piece of data that is part of the information in the previous imagery frame. Examples could be the maximum light level, the minimum light level, the integration duration, or the temperature of the focal plane array.
(332) Light Beams
(333) Elements of the light that are being collected by the telescope and focused on the focal plane of that telescope.
(334) Low Earth Orbit (LEO)
(335) Low Earth orbit is defined as a low eccentricity orbit (eccentricity < 0.1) with an apogee less than 1,200 km above the surface of the Earth.
(336) Maximum Light Level
(337) The highest level of light impinging on a sensor element.
(338) Mirror
(339) A reflective element in an optical path.
(340) Modulator
(341) An electronic element that takes input signals, typically digital signals but possibly analog signals, and transforms them into a series of transitions that can be transmitted over a radio frequency link.
(342) Molniya Orbit
(343) A Molniya orbit is a special case of a highly inclined, highly elliptical orbit. The Molniya orbit has an orbit period of 12 hours, which allows observations from a satellite in that orbit of the polar regions for approximately eight of each 12 hour period, and passes over the same region of the Earth every 24 hours.
(344) Motion Change Detector
(345) An electronic system that characterizes the differences between successive images and data sets as objects move from one set of pixels to a different, possibly overlapping, set of pixels.
(346) Motion Trend Comparator
(347) An electronic system that characterizes the patterns that can be represented by a series of changes in images and data of objects as those objects move within the field of view.
(348) Nadir
(349) Nadir is defined as the direction of the center of the Earth from the viewpoint of the satellite. An object or apparatus pointed toward nadir is pointed toward the center of the Earth.
(350) Notification Format Data Element
(351) A set of data that specifies the manner in which events are reported.
(352) Observable Area Footprint
(353) The observable area footprint is that region of the surface of the Earth that is observable by a satellite or the sensor aboard that satellite.
(354) Observable Hemisphere
(355) The observable hemisphere is that region of the Earth, the atmosphere above it, and the space above that atmosphere that is observable by a satellite or the sensor aboard that satellite. Because of the geometric limitations of being a finite distance from the surface of the Earth, the observable hemisphere is less than the physical hemisphere of the Earth, i.e., less than 50% of the Earth. The farther the satellite is from the Earth, the larger the fraction of the true 50% hemisphere the observable hemisphere becomes; i.e., the observable hemisphere for a satellite at geostationary orbit is smaller than the observable hemisphere for a satellite at either of the L4 or L5 Lagrange points.
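The statement that the observable hemisphere is less than a true 50% hemisphere follows from spherical-cap geometry. The brief Python sketch below is illustrative only; the Earth radius and geostationary altitude are standard approximate values:

```python
R_EARTH_KM = 6378.0  # approximate equatorial radius of the Earth

def observable_fraction(altitude_km: float) -> float:
    """Fraction of Earth's surface visible from a satellite at the given altitude.

    The visible spherical cap has half-angle theta with cos(theta) = Re / (Re + h),
    and a cap covers (1 - cos(theta)) / 2 of the full sphere.
    """
    r = R_EARTH_KM + altitude_km
    return (1.0 - R_EARTH_KM / r) / 2.0

# From geostationary altitude (~35,786 km), just over 42% of the Earth is
# observable, confirming the observable hemisphere is less than a true 50%.
print(f"{observable_fraction(35786.0):.3f}")  # ~0.424
```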
(356) Optimized Image Capture
(357) Data of observations is represented by images and related data. Optimized images have minimized noise and the greatest signal to noise ratio throughout the image.
(358) Perigee
(359) The point in an orbit in which the object in the orbit is closest to the center of the Earth.
(360) Persistent Imaging
(361) Persistent Imaging is defined as there being a gap between successive images of the same area of the Earth or an object in the vicinity of the Earth of no more than 30 seconds independent of the time of day or day of year with the imaging being maintained over long durations.
(362) Persistent Imaging versus Continuous Imaging
(363) Persistent Imaging does not require Continuous Imaging. Persistent Imaging may create images with gaps of no more than 30 seconds for extended periods but not continuously over very long periods. For example, Persistent Imaging from a satellite in a Molniya Orbit may provide imaging of an Earth polar region over 12 or more hours with no greater than 30 seconds between images during that 12 or more hour period, but not over 24 hours of each and every day, which would be Continuous Imaging.
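The 30-second gap criterion for Persistent Imaging can be expressed as a simple check over image timestamps. The following is a minimal Python sketch for illustration only; the timestamps in the examples are hypothetical:

```python
def is_persistent(image_times_s, max_gap_s=30.0):
    """True when no gap between successive image timestamps exceeds max_gap_s."""
    times = sorted(image_times_s)
    return all(b - a <= max_gap_s for a, b in zip(times, times[1:]))

# Images every 10 seconds satisfy the 30-second persistence criterion:
print(is_persistent([0, 10, 20, 30, 40]))  # True
print(is_persistent([0, 10, 55]))          # False (45-second gap)
```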
(364) Pixel
(365) A picture element within an image.
(366) Post Processing
(367) Electronic handling of images and related data that happens after the image is read out of the focal plane array.
(368) Radiation
(369) Radiation is any combination of light, free electrons, free protons, free neutrons, x-rays, or gamma-rays that may impinge on a system.
(370) Reference Database FPGA
(371) A field programmable gate array (FPGA) that stores data that can be read out at a later time as well as updated by loading new firmware into the FPGA.
(372) Real-time
(373) Real-time is defined as the end user being able to see an event in less than thirty seconds after an event occurs, including the aggregate of durations for:
(374) light to get from the event to the sensor,
(375) the sensor to capture the light and create an electronic image, the sensor readout, the read out data processing,
(376) the transmission to a ground processing site,
(377) ground processing,
(378) distribution to the end user, and
(379) end user display.
(380) Real-time v. Live
(381) Live is defined as the end user being able to see an event in less than ten seconds after an event occurs. Real-time is defined as the end user being able to see an event in less than thirty seconds after an event occurs, including the aggregate of all durations for:
(382) light to get from the event to the sensor,
(383) the sensor to capture the light and create an electronic image,
(384) the sensor readout,
(385) the read out data processing,
(386) the transmission to a ground processing site,
(387) ground processing,
(388) distribution to the end user, and
(389) end user display.
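The latency components listed above can be summed into an end-to-end budget and compared against the ten-second "Live" and thirty-second "Real-time" thresholds. The per-stage values in the Python sketch below are hypothetical, assumed figures for illustration only:

```python
# Hypothetical per-stage latencies, in seconds (illustrative values only):
latency_budget_s = {
    "light_travel": 0.12,             # ~36,000 km from GEO at the speed of light
    "integration_and_capture": 1.0,
    "sensor_readout": 2.0,
    "readout_data_processing": 2.0,
    "transmission_to_ground": 1.0,
    "ground_processing": 5.0,
    "distribution_to_end_user": 3.0,
    "end_user_display": 1.0,
}

total_s = sum(latency_budget_s.values())
print(f"end-to-end latency: {total_s:.2f} s")
print("live" if total_s < 10 else "real-time" if total_s < 30 else "neither")
```

With these assumed figures the aggregate is about 15 seconds, which meets the thirty-second Real-time definition but not the ten-second Live definition.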
(390) Refractive Element
(391) A refractive element is one in which the direction of the light going through the component is changed. One example of a refractive element is a focusing lens.
(392) Reset Electronics
(393) Reset electronics are components that reset other electronics to a specified condition. One example of reset electronics is the set of electronics that cause focal plane arrays to go to a state with zero electrons captured, allowing a new image to be captured.
(394) Satellite
(395) An object that orbits another object. A common usage of the term satellite is that of a man-made object in orbit about the Earth.
(396) Satellite Bus
(397) The Satellite Bus is the entire structure and equipment making up the satellite except for the sensors and equipment and electronics exclusive to those sensors.
(398) Satellite Sensor Footprint
(399) The Satellite Sensor Footprint is the area being observed by a satellite's sensor at a given point in time. The Satellite Sensor Footprint is less than or equal to the Satellite Observable Footprint.
(400) Satellite Observable Footprint
(401) The Satellite Observable Footprint is the maximum area on the surface of the Earth that is viewable from a satellite at a given point in time.
(402) Scanning Sensor
(403) A Scanning Sensor is a sensor that accumulates imagery as its observation and recording area is moved across the area or object of interest. To create a two dimensional image larger than the observation region of the scanning sensor, the field of view of the sensor is moved in a direction perpendicular to the first dimension of motion and retraces to the beginning point of the original scan motion. The process is repeated until the full area or object of interest is observed and recorded.
(404) Sensor
(405) An electronic and optical system used to observe and record images and data of the field of interest.
(406) Sensor Image Processing Compression
(407) A set of electronic elements that takes advantage of redundant data within an image and related data to utilize known algorithms to represent the original data set of a given number of bits with another data set of a smaller number of bits.
(408) Sensor Image Readout
(409) A set of electronic elements that extract the information in the sensor for transfer to other electronics for processing.
(410) Serial Bit Stream Serializer
(411) A set of electronic elements that ingest multiple bit streams in a parallel manner and then convert them to one or more serial streams.
(412) Software Controlled Processing Electronic Elements
(413) Electronic elements that process information based upon software loaded into those electronics.
(414) Sparsely Populated
(415) A set of components that do not form a physically contiguous set. One set of sparsely populated elements can be a set of focal plane arrays (FPAs) that are set into a single focal plane but are set apart from each other and thus not forming a single, contiguous set.
(416) Staring Sensor
(417) A staring sensor is a combination of optics, e.g., a refractive telescope, reflective telescope, or catadioptric telescope, and recording elements, e.g., focal plane arrays, that is continuously pointed toward and observing a given object or space, i.e., staring at that given object or space. A staring sensor generally observes and records essentially the entire observable region or object at generally the same time.
(418) Stitched Together
(419) A method in which separately recorded imagery elements are combined to create a larger image or data set.
(420) Terminator
(421) The terminator, with regard to the Earth and the Earth's lighting, is the region on the Earth and within the Earth's atmosphere that encompasses the transition from day lighting to night lighting.
(422) Uncooperative or Non Cooperative Target
(423) Any object over which the operators of the satellite and sensor cannot have direct or indirect control.
SCOPE OF THE CLAIMS
(424) Although the present invention has been described in detail with reference to one or more preferred embodiments, persons possessing ordinary skill in the art to which this invention pertains will appreciate that various modifications and enhancements may be made without departing from the spirit and scope of the Claims that follow. The various alternatives for providing a Real-time Satellite Imaging System disclosed above are intended to educate the reader about preferred embodiments of the invention, and are not intended to constrain the limits of the invention or the scope of the Claims.
LIST OF REFERENCE CHARACTERS
(425) ED Earth's disc
(426) ES Earth's surface
(427) FP Focal plane
(428) FPA Focal plane array
(429) IO Imaging optics
(430) L Light rays
(431) SC 1 First scan
(432) SC 2 Second scan
(433) SC 3 Third scan
(434) SC 4 Fourth scan
(435) SM Scanning mirror
(436) SC N Nth Scan
(437) SR Scan retrace
(438) SO Moving object being scanned
(439) SC A First scan of the moving object
(440) SC B Second scan of the moving object
(441) SC C Third scan of the moving object
(442) SC D Fourth scan of the moving object
(443) OM Direction of motion of the moving object
(444) 10 Satellite
(445) 12 Real-time Satellite Imaging System
(446) 14 Image Sensor
(447) 15 Solar Arrays
(448) 16 Transmitter
(449) 17 One embodiment of image sensor
(450) 18 Co-collimated telescopes
(451) 20 First refractive elements
(452) 22 Focal plane
(453) 24 Focal plane array
(454) 24AA First focal plane array in the first focal plane
(455) 24AB Second focal plane array in the first focal plane
(456) 24AC Third focal plane array in the first focal plane
(457) 24AD Fourth focal plane array in the first focal plane
(458) 24AE Fifth focal plane array in the first focal plane
(459) 24AF Sixth focal plane array in the first focal plane
(460) 24AG Seventh focal plane array in the first focal plane
(461) 24AH Eighth focal plane array in the first focal plane
(462) 24AI Ninth focal plane array in the first focal plane
(463) 24BA First focal plane array in the second focal plane
(464) 24BB Second focal plane array in the second focal plane
(465) 24BC Third focal plane array in the second focal plane
(466) 24BD Fourth focal plane array in the second focal plane
(467) 24BE Fifth focal plane array in the second focal plane
(468) 24BF Sixth focal plane array in the second focal plane
(469) 24BG Seventh focal plane array in the second focal plane
(470) 24BH Eighth focal plane array in the second focal plane
(471) 24BI Ninth focal plane array in the second focal plane
(472) 24CA First focal plane array in the third focal plane
(473) 24CB Second focal plane array in the third focal plane
(474) 24CC Third focal plane array in the third focal plane
(475) 24CD Fourth focal plane array in the third focal plane
(476) 24CE Fifth focal plane array in the third focal plane
(477) 24CF Sixth focal plane array in the third focal plane
(478) 24DA First focal plane array in the fourth focal plane
(479) 24DB Second focal plane array in the fourth focal plane
(480) 24DC Third focal plane array in the fourth focal plane
(481) 24DD Fourth focal plane array in the fourth focal plane
(482) 24DE Fifth focal plane array in the fourth focal plane
(483) 24DF Sixth focal plane array in the fourth focal plane
(484) One embodiment of a single co-collimated telescope
(485) 26a Reflective element, primary mirror
(486) 26b Reflective element, secondary mirror
(487) 28 Second refractive elements
(488) 30 Alternative embodiment of a single co-collimated telescope
(489) 31 Telescope aperture
(490) 32 Folding mirrors
(491) 34 Overlap regions of sparsely populated focal plane arrays
(492) 36 Imagery and Data Flow Functional Block Diagram
(493) 38 Sensor image controller
(494) 40 Software controlled electronics
(495) 42 Sensor image readout
(496) 44 Imagery and Data
(497) 46 Software controlled readout electronic elements
(498) 48 Image processor
(499) 50 Sensor Image Readout Data including Maximum Light Level
(500) 52 Software controlled processing electronic elements
(501) 54 Processed Images
(502) 56 Sensor Control Functional Block Diagram
(503) 58 Focal Plane Array Subelement
(504) 60 Sensor Image Controller Field Programmable Gate Array (FPGA)
(505) 62 Integration Time per FPA control element
(506) 64 Frame Rate of FPA control element
(507) 66 Integration Time per subelement of FPA control element
(508) 68 Frame Rate of subelement of FPA control element
(509) 70 Sensor Image Readout Block Diagram
(510) 72 Readout Electronics Field Programmable Gate Array (FPGA)
(511) 74 Sensor Image Readout element sensing Maximum Light Level
(512) 76 Sensor Image Readout Transmitting Maximum Light Level to Sensor Image Control
(513) 78 Image Processor Functional Block Diagram
(514) 80 Sensor image processing compression
(515) 82 Sensor image processing formatting for transmission
(516) 84 View of use of refractive plate
(517) 86 High index of Refraction Plate
(518) 88 Incident light beams
(519) 90 Multiple paths of light beams
(520) 92 Singly shifted exiting light rays
(521) 94 Voltage control element
(522) 96 First schematic diagram of plate pair
(523) 98 Second schematic diagram of plate pair
(524) 100 Singly shifted exiting light rays
(525) 102 Doubly shifted exiting light rays
(526) 104 Schematic diagram of imaging sensor
(527) 106 Functional Block Diagram of Image Sensor Control and Readout and Processing
(528) 108 Geostationary Orbit (GSO) arc
(529) 110 Field of View of the Geostationary Orbit Satellite
(530) 112 Observable Surface and Atmosphere from GSO
(531) 114 The Earth
(532) 116 Overlapping areas of observation areas of GSO satellites as seen from a north pole vantage point
(533) 118 2D Cylindrical Projection of the Earth
(534) 120 Geostationary Satellite Observable Area Footprint
(535) 122 Observable Area-Overlap of the three equally spaced GSO satellites
(536) 124 Molniya Orbit Arc
(537) 126 Field Of View of the Molniya Orbit Satellite with the satellite at apogee
(538) 128 Observable Area of the Satellite at apogee with the satellite in a Molniya Orbit
(539) 130 Southern edge of the observable footprint of a Molniya orbit satellite at apogee
(540) 132 Ground Track of the Molniya Orbit
(541) 134 Molniya orbit satellite #1
(542) 136 Molniya orbit satellite #2
(543) 138 Molniya Orbit Arc of two satellites
(544) 139 Molniya Orbit satellite #1 field of view
(545) 140 Molniya Orbit Satellite #1 observable footprint
(546) 141 Molniya Orbit satellite #2 field of view
(547) 142 Molniya Orbit Satellite #2 observable footprint
(548) 144 Ground Track of the Molniya Orbit Satellite #1
(549) 146 Ground Track of the Molniya Orbit Satellite #2
(550) 148 Coverage Area within the observable footprint of Molniya Satellite #1
(551) 150 Coverage Area within the observable footprint of Molniya Satellite #2
(552) 152 Molniya Orbit satellite #3
(553) 154 Molniya Orbit satellite #4
(554) 155 Molniya Orbit satellite #3 field of view
(555) 156 Coverage Area within the observable footprint of Molniya Satellite #3
(556) 157 Molniya Orbit satellite #4 field of view
(557) 158 Coverage Area within the observable footprint of Molniya Satellite #4
(558) 160 Overlap of Coverage Areas of observable footprints of Molniya Satellites #1 and #2
(559) 162 Overlap of Coverage Areas of observable footprints of Molniya Satellites #3 and #4
(560) 164 Observable Area of North Molniya Satellite at Apogee
(561) 166 Observable Area of South Molniya Satellite at Apogee
(562) 168 Observable Area of Western Pacific Geostationary Satellite
(563) 170 Observable Area of Western Hemisphere Geostationary Satellite
(564) 172 Observable Area of European/African Hemisphere Geostationary Satellite
(565) 174 Area of Maximum Overlap of Observations with 4 satellites observing at the same time
(566) 176 Highly Elliptical Orbit Arc
(567) 178 Field of View of a Highly Elliptical Orbit Satellite
(568) 180 Observable Area of a Highly Elliptical Orbit Satellite
(569) 182 Satellite in a 24 hour Highly Elliptical Orbit
(570) 184 Ground Track for a 24 hour Highly Elliptical Orbit satellite
(571) 186 Southern Edge of an Observable Footprint of a 24 hour Highly Elliptical Orbit satellite
(572) 188 Field of View of the Earth of a satellite at a L4 or L5 Lagrange Point
(573) 190 Equilateral Triangles approximating positions of the L4 & L5 Lagrange Points
(574) 192 Moon
(575) 194 Observable Area of a satellite at the L4/L5 Lagrange Points
(576) 196 Individual Pixels
(577) 198 Focal Plane Array Sub Element Consisting of Multiple Pixels
(578) 200 Reset Electronics Controlling FPA Sub Element
(579) 202 Light Integration Timer Controlling FPA Sub Element
(580) 204 Analog to Digital Conversion Trigger Electronics Controlling FPA Sub Element Analog to Digital Conversions
(581) 206 Readout Trigger Electronics to Trigger both FPA Sub Elements and Readout Electronics
(582) 208 Readout Electronics Reading FPA Port #1
(583) 210 Readout Electronics Reading FPA Port #2
(584) 212 Image Formatting Element
(585) 214 Formatted Imagery Transferred to Image Processing Element
(586) 216 Integration Timing Data to Image Processing Electronic Element
(587) 218 Contrast Detection within each FPA Sub Element
(588) 220 Color Detection within each FPA Sub Element
(589) 222 Change Detection within each FPA Sub Element
(590) 224 Contrast Detection across FPA sub elements within each FPA
(591) 226 Color Detection across FPA sub elements within each FPA
(592) 228 Change Detection across FPA sub elements within each FPA
(593) 230 Contrast Detection across multiple FPAs
(594) 232 Color Detection across multiple FPAs
(595) 234 Change Detection across multiple FPAs
(596) 236 Parallel to Serial Bit Stream Serializer
(597) 238 Serial Bit Stream
(598) 240 Bit Stream Compressor
(599) 242 Bit Stream Interleaver
(600) 244 Bit Stream Encryptor
(601) 246 Forward Error Correction
(602) 248 Modulator
(603) 250 Modulated Analog, Intermediate Frequency Signal
(604) 252 Block Upconverter
(605) 254 Radio Frequency Signal
(606) 256 Power Amplifier
(607) 258 Antenna
(608) 260 Pixel Readout Electronic Element
(609) 262 Pixel Data
(610) 264 Max Brightness Track and Hold
(611) 266 Brightness Level Detection Element
(612) 268 Brightness versus Saturation Level Detection
(613) 270 Saturation Level Set
(614) 272 Next Frame Integration Duration Set
(615) 274 FPA Integration Duration Set
(616) 276 FPA In Set Signal
(617) 278 Conversion Trigger Signal
(618) 280 Readout Trigger Signal
(619) 282 Image Processing FPGA
(620) 284 Static Position Change Detector
(621) 286 Motion Change Detector
(622) 288 Image Comparator
(623) 290 Last Frame Data Element
(624) 292 Reference Data Base FPGA
(625) 294 Notification Set Generation Element
(626) 296 Notifications
(627) 298 Notification Format Data Element
(628) 300 Position Comparator
(629) 302 Location Data Element
(630) 304 Signature Comparator
(631) 306 Signature Data Element
(632) 308 Next Frame Integration Time Comparator
(633) 310 Noise & Dark Current Time Related Data element
(634) 312 Motion Image Comparator
(635) 314 Motion Image Data element
(636) 316 Motion Notification Set Generation element
(637) 318 Motion Notification Format Data
(638) 320 Motion Signature Comparator
(639) 322 Motion Signature Data element
(640) 324 Motion Trend Comparator
(641) 326 Motion Trend Data element
(642) 328 Motion Integration Time Comparator
(643) 330 Noise & Dark Current Motion Related Data element