NAVIGATION APPARATUS AND POSITION DETERMINATION METHOD
20230087890 · 2023-03-23
CPC classification
G01S13/874
PHYSICS
Abstract
A navigation apparatus includes an image capturing device, template database, correlation device, evaluation device, and output interface. The image capturing device can create a radar image of a surround, the template database configured to provide at least one template substantially matched to the radar image and containing at least one geo-referenced landmark, the at least one geo-referenced landmark being geo-referenced by at least one geo-coordinate. The correlation device can correlate the at least one geo-referenced landmark in the at least one template with the radar image and provide the at least one geo-coordinate belonging to the at least one geo-referenced landmark. The evaluation device can determine a position of the navigation apparatus from the at least one geo-coordinate of the at least one geo-referenced landmark and from a setting of the image capturing device. The output interface is configured to provide the determined position.
Claims
1. A navigation apparatus, comprising: an image capturing device; a template database; a correlation device; an evaluation device; and an output interface; wherein the image capturing device is configured to create a radar image of a surround; wherein the template database is configured to provide at least one template which is substantially matched to the radar image and which contains at least one geo-referenced landmark; the at least one geo-referenced landmark being geo-referenced by at least one geo-coordinate; wherein the correlation device is configured to correlate the at least one geo-referenced landmark present in the at least one template with the radar image and to provide a position of the at least one geo-referenced landmark in relation to the radar image; wherein the evaluation device is configured to determine navigation information of the navigation apparatus from the position of the at least one geo-referenced landmark in relation to the radar image, from the at least one geo-coordinate of the at least one geo-referenced landmark, and from image capturing metadata of the image capturing device; and wherein the output interface is configured to provide the determined navigation information.
2. The navigation apparatus of claim 1, wherein the template database is configured to normalize the at least one provided template in relation to the radar image.
3. The navigation apparatus of claim 2, wherein the normalizing is implemented by taking account of a distortion present in the radar image in the at least one template.
4. The navigation apparatus of claim 2, wherein the normalizing includes taking account of a terrain profile of the template and/or testing of hypotheses about a pose of the at least one template in relation to the radar image.
5. The navigation apparatus of claim 1, wherein the template database is configured to provide at least one metadatum which belongs to the at least one geo-referenced landmark.
6. The navigation apparatus of claim 1, wherein the output interface comprises a navigation system and/or an image processing device for fusing the navigation information.
7. The navigation apparatus of claim 1, wherein the image capturing metadata contain a squint angle, and wherein the evaluation device is configured to correct the squint angle based on an altitude error.
8. The navigation apparatus of claim 7, wherein the evaluation device is configured to access elevation information of the landmark derived from the at least one metadatum, a measured altitude, and/or an ascertained altitude for correcting an error of the squint angle on account of an assumed reference elevation of a terrain profile.
9. The navigation apparatus of claim 1, wherein the image capturing device is an SAR system.
10. The navigation apparatus of claim 1, wherein the evaluation device is configured to ascertain a statistical error of the navigation information.
11. An aircraft having a navigation apparatus of claim 1.
12. A position determination method comprising: creating a radar image of a surround; acquiring image capturing metadata of an image capturing device used to record the radar image of the surround; providing at least one template which is substantially matched to the radar image and which contains at least one geo-referenced landmark; wherein the at least one geo-referenced landmark is geo-referenced by at least one geo-coordinate; correlating the at least one geo-referenced landmark present in the at least one template with the radar image; providing a position of the at least one geo-referenced landmark in relation to the radar image; determining navigation information from the position of the at least one geo-referenced landmark in relation to the radar image, from the at least one geo-coordinate of the at least one geo-referenced landmark, and from image capturing metadata of the image capturing device; and providing the determined navigation information.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0119] Exemplary embodiments are described in more detail below with reference to the appended drawings. The illustrations are schematic and not to scale. Identical reference signs refer to identical or similar elements.
DETAILED DESCRIPTION
[0132] The navigation apparatus 100 comprises an image capturing device 101, a template database 102, a correlation device 103, an evaluation device 104, and an output interface 105, with the image capturing device 101 being configured to create a radar image 106 of a surround. In this case, the radar image 106 may be a juxtaposition of a multiplicity of SAR strips 106a in the stripmap mode, or the directly recorded image in the spotlight mode. The radar image 106, for example an SAR radar image 106, comprises a multiplicity of scanned pixels, which are addressed by pixel coordinates.
[0133] Initially, SAR strips 106a generated by the SAR raw data acquiring device 101a or by the SAR sensor 101a are transmitted together with image capturing metadata XX or settings metadata XX to the evaluation device 104. The image capturing metadata XX or settings metadata XX substantially include a setting of the image capturing device 101. These settings metadata XX may provide at least some of the relative geometry used for imitating the distortion.
[0134] In addition to the template database 102 and the correlation device 103, the evaluation device 104 comprises further submodules 104a, 104b, 104c, which may be realized as hardware and/or software modules, and which adopt specific tasks of the evaluation device 104.
[0135] By way of example, the image generating device 104b receives the individual strip images 106a and assembles these to form an SAR image 106b. Moreover, it also receives the image capturing metadata XX.
[0136] All modules of the navigation apparatus 100 have a redundant design in an example. This can reduce the outage probability of the navigation apparatus 100.
[0137] The template database 102 is configured such that it provides at least one template 107 substantially matching the radar image 106a, 106b, or a matching template 107 together with at least one geo-referenced landmark 501 and, in particular, the landmark metadata YY thereof. At least one geo-referenced landmark 501 and/or an associated template 107 can be selected by the feature selection device 104a. A piece of landmark meta-information YY from a plurality of available pieces of landmark meta-information YY may be assigned to the geo-referenced landmark 501.
[0138] The at least one geo-referenced landmark 501 is geo-referenced by way of at least one geo-coordinate. The geo-referenced landmark 501 can be transmitted to the image navigation device 104c together with further requested metadata YY as landmark metadatum YY and/or as landmark metadata YY.
[0139] With the aid of the correlation device 103, the image navigation device 104c is able to correlate the template 107, in particular the at least one geo-referenced landmark 501 present in the at least one template 107, with the radar image 106c and to provide the at least one geo-coordinate belonging to the at least one geo-referenced landmark 501. As a result of the correlation, the geo-coordinate of the geo-referenced landmark 501 can be linked with a pixel coordinate from the radar image 106.
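The correlation described in the preceding paragraph can be sketched as follows. This is a minimal illustration, assuming the radar image 106b and the template 107 are available as grayscale NumPy arrays; the function name `correlate_template` is illustrative and not part of the apparatus as claimed:

```python
import numpy as np

def correlate_template(image, template):
    """Slide the template over the image and return the pixel
    coordinate (row, col) of the best normalized match, a
    simplified stand-in for the correlation device 103."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.sqrt((t ** 2).sum())
    best, best_rc = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * tnorm
            if denom == 0:
                continue  # flat window, no correlation defined
            score = (wz * t).sum() / denom
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc
```

The returned pixel coordinate can then be linked to the geo-coordinate stored with the landmark in the template database, as described above.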
[0140] For correlation purposes, a homographic transformation derived from the SAR radar image 106b prepared by the image generating device 104b may be applied to the provided template 107 in an example, with common features in the template 107 and the radar image 106 being found in this way. The image 106c processed in this way may highlight common features, for example a landmark 501 present both in the SAR radar image 106 and in the template 107.
[0141] In the processed image 106c, the template 107 and the SAR radar image 106b have been overlaid so that a geo-referenced landmark 501 present in both images coincides with itself, as depicted in
[0142] An own position, that is to say navigation information, in particular the position at which the navigation apparatus 100 is located, can be determined from the geo-referenced landmark 501 and the associated image capturing metadata XX of the radar image in combination with further measured values, for example an altitude measured value, which may be supplied by the onboard navigation system 108. In a further example, the range and angle information contained in the metadata XX of the radar image 106b which is available in relation to the geo-referenced point after the correlation method, for example a squint angle, is directly processed in the navigation filter. From a multiplicity of metadata and/or measurement values, a navigation filter selects the ones currently required. A navigation filter may be fixedly set and select a specifiable group of data. In an example, a navigation filter may fuse navigation information from the evaluation device 104 and/or from the navigation system 108.
[0143] Expressed differently, the evaluation device 104 is configured to determine a position of the navigation apparatus 100 from the at least one geo-coordinate of the at least one geo-referenced landmark 501 and from a setting of the image capturing device XX, which may have been provided as an image capturing metadatum XX for example. The geo-coordinate may be provided as a landmark metadatum YY.
[0144] This determined and/or ascertained position and/or the correlated metadata of the radar image 106b, in particular the range and angle information related to the geo-referenced point, can be provided together with the geo-coordinates of the geo-referenced point at the output interface 105 in a specifiable data format.
[0145] This may mean that the correlation device 103 is configured to correlate the at least one geo-referenced landmark 501 present in the at least one template 107 with the radar image 106b and to provide a position of the at least one geo-referenced landmark 501 in relation to the radar image 106b.
[0146] The result of the correlation is a pixel coordinate of the SAR image 106b, with the pixel coordinate corresponding to the geo-referenced landmark 501.
[0147] This knowledge allows the meta-information XX or settings data of the SAR sensor 101a, for example the squint angle and the measured variables thereof, for example a range measurement, to be related to the geo-referenced landmark 501 in order thus to obtain navigation information relating to the own position.
[0148] The evaluation device 104 is configured in such a way that it determines navigation information of the navigation apparatus 100 from the position of the at least one geo-referenced landmark 501 in relation to the radar image 106b, that is to say the pixel coordinate, from the at least one geo-coordinate of the at least one geo-referenced landmark 501, and from image capturing metadata XX of the image capturing device 101.
[0149] The output interface 105 is configured to provide the determined navigation information internally or externally.
[0150] In the example depicted in
[0151] Should the position data be provided by way of the interface 105, the navigation apparatus 100 can substantially replace a GNSS system and can adopt the function of a position sensor.
[0152] Should there be an outage of or interference in a GNSS, the navigation can be continued in this way with the aid of the navigation apparatus 100 on account of the similar information content.
[0153] The output interface 105 is essentially connected to a navigation module 109 of the navigation system 108, which comprises a navigation filter, for example an extended Kalman filter, in order to fuse the navigation information with other sensor data, for example data of an INS and/or an IMU and of an altimeter, in particular a barometric altimeter.
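The fusion step in the navigation filter can be sketched for a single scalar state. This is a minimal illustration of a Kalman measurement update; an extended Kalman filter as mentioned above operates on a full state vector, so the one-dimensional form and the function name `kalman_update` are illustrative assumptions only:

```python
def kalman_update(x_pred, p_pred, z, r):
    """One scalar Kalman measurement update: fuse the INS-predicted
    position x_pred (variance p_pred) with a position fix z
    (variance r), such as one supplied via the output interface 105."""
    k = p_pred / (p_pred + r)          # Kalman gain
    x = x_pred + k * (z - x_pred)      # fused position estimate
    p = (1 - k) * p_pred               # reduced uncertainty
    return x, p
```

With equal variances, the fused estimate lies halfway between prediction and measurement, and the variance is halved.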
[0154] The navigation system 108 comprises a position estimating device 111 and a backup sensor device 112. The position estimating device 111 is configured to ascertain an estimated value of the position, for a time at which the evaluation device 104 supplies no data, for example while an SAR radar image 106b is created from the SAR strips 106a. The duration of this time may extend until the adjustable number of SAR strips provided for the image width of the SAR image 106b in the cross-range direction has been acquired.
[0155] In this case, the position estimating device 111 derives the estimated position from the navigation data of the navigation system 108, for example from inertial sensor data such as an INS or an IMU. Navigation data of further backup sensor systems, for example data of an altimeter, is supplied by the backup sensor device 112.
[0156] However, the navigation data may at least in part also be provided by the image capturing device 101 itself. This is because for the purposes of creating the SAR strips 106a, the image capturing device 101 can comprise its own altitude detecting device 101b and its own raw data INS acquiring device 101c in addition to the SAR raw data acquiring device 101a.
[0157] An SAR system 101 inherently comprises an SAR-own navigation system 101c having an INS or IMU, a GNSS receiver, and optionally an own altimeter in order to carry out the movement compensation with a very high update rate and low latency.
[0158] The SAR-own navigation system 101c operates independently of the onboard navigation system 108. The onboard navigation system 108 is that navigation system of the aircraft, which may comprise further backup sensors, for example terrain-referenced navigation (TRN).
[0159] In this case, the onboard navigation system 108, in particular the INS/IMU, calculates correction data for the navigation data of the SAR system 101, with the correction data being applied by the navigation apparatus 100 for the purposes of correcting the navigation data, for example the bearing, the position, and in particular the speed. The functional setup may also be designed such that the SAR-own navigation system of the SAR system 101 independently calculates the corrections to the navigation data on the basis of the navigation data 105 which correspond to the navigation information “extracted” from the SAR image.
[0160] The SAR raw data acquiring device 101a may comprise the radar sensor or SAR sensor 101a and may substantially put together the range data of the radar sensor 101a measured by the radar beam to form the SAR strips 106a. The altitude detecting device 101b is configured to detect and provide the barometric altitude of the navigation apparatus 100. The INS acquiring device 101c is configured to provide movement data such as bearing, angular rates, position, speed and acceleration, which for example are used for a movement compensation required in an SAR system 101. A movement compensation attempts to compensate a flight not in a straight line. This may be used when producing a virtual antenna array, within the scope of which the recordings of a single antenna are juxtaposed in time in order to form a recording of a large antenna array. In this case, the movement compensation may assist with juxtaposing the individual recordings of the antenna lined-up in time despite a flight that is not in a straight line. In the image capturing device 101, all data are merged and provided in the SAR control device 101d. The image capturing metadata XX are also generated here, substantially from the settings data of the SAR sensor 101a.
[0161] The data 106a, XX are provided to the evaluation device 104 via the connecting line 113.
[0162] Via the connecting line 114, the image capturing device 101 in turn receives navigation data from the navigation system 108 and optionally also from the evaluation device 104. Thus, position data generated by the evaluation device 104 or the data further processed by the navigation system 108 based on the data from the evaluation device 104 can be fed back to the image capturing device 101 in order to correct the navigation data based on the INS 101c, which have degraded since the last provision of backup sensor data.
[0163] Position data are transmitted from the evaluation device 104 to the navigation system 108 via the image capturing device 101. The connecting lines 113, 114, 115 can be part of a communications bus system, for example an aircraft bus system.
[0164] A method for determining the position of an aircraft by a synthetic aperture radar (SAR) can be carried out by the navigation apparatus 100.
[0165] A synthetic aperture radar (SAR) is an imaging radar which can be used as a “navigation sensor” by the navigation apparatus 100. An SAR image 106b and the associated image capturing metadata XX, for example the range of the individual pixels and/or the setting of the SAR sensor 101a, serve as a basis for navigation information extraction, correction and optional transformation, in order thus to arrive at an accurate and autonomous navigation solution.
[0166] The navigation apparatus 100 can operate autarkically, that is to say substantially independently of GPS or other external systems 108. In this case, it essentially only uses settings of the image capturing device 101, for example the squint angle, a speed value, or an implemented measurement by the image capturing device 101, for example a range measurement, as navigation information and for the purposes of correcting errors. Substantially no prior knowledge may be required in this case. Hence, the navigation apparatus 100 can readily adopt the functionality should an existing position determination system 108, for example a GPS system, develop a fault.
[0167] A synthetic aperture radar (SAR) is an imaging radar and supplies a black and white image 106, with the brightness reflecting the received power. Substantially no altitude information is contained in the radar image 106.
[0168] This image 106 is created in several stages in the stripmap mode while the image of the scanned region is provided as a whole in the spotlight mode.
[0169] In the stripmap mode, an SAR image strip 106a is initially produced by sampling a reflected and in turn received radar signal at fixed sampling times relative to the emission time. This sampling at fixed times, which substantially correspond to the height of the SAR strip 106a, corresponds to a time-of-flight measurement. A unique range can thus be assigned to each sampling time and hence to each pixel of the SAR strip 106a.
[0170] The sampling frequency may determine the in-range height of the SAR strips 106a.
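The time-of-flight relation described above can be stated numerically: a pixel sampled at time t after emission corresponds to a two-way range of c·t/2, and the sampling frequency fixes the in-range pixel height as c/(2·f_s). The following is a minimal sketch under those standard relations; the function names are illustrative:

```python
C = 299_792_458.0  # speed of light [m/s]

def sample_range(t_sample_s):
    """Range assigned to a pixel sampled t_sample_s after pulse
    emission (two-way time-of-flight, hence the factor 1/2)."""
    return C * t_sample_s / 2.0

def range_resolution(f_sample_hz):
    """In-range height of one SAR strip pixel implied by the
    sampling frequency."""
    return C / (2.0 * f_sample_hz)
```

For example, a sample taken 10 microseconds after emission corresponds to a range of roughly 1.5 km, and a 100 MHz sampling rate to an in-range pixel height of roughly 1.5 m.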
[0171] Expressed differently, a distinction should be made between two modes of operation of the SAR sensor 101a, the stripmap mode and the spotlight mode. In the case of the stripmap mode, the SAR image is continually expanded with the scanned and/or swept terrain, in particular using the reflection data produced by the terrain.
[0172] In the case of the spotlight mode, the SAR raw data acquiring device 101a is “aligned” at a certain point or spot and the latter is observed over a relatively long distance, for example a flight route. In the process, the distance travelled is compensated by calculation by way of a movement compensation in order to compensate a movement of the SAR raw data acquiring device 101a, in particular compensate the movement of an antenna from which the radar beams are emitted.
[0173] In relation to the utilized physical measurement principle of the time-of-flight measurement, the two SAR modes are substantially identical. However, on account of the longer “exposure time” caused by the alignment at one point, the spotlight mode allows generation of an SAR image 106b with a higher resolution than in the stripmap mode.
[0174] As the movement advances, the SAR 101a places the physical antenna virtually next to the previous antenna position in relation to the previously emitted radar pulse, in order thus to form a synthetic antenna array. In particular, the produced SAR strips 106a may be computationally juxtaposed in order to produce the SAR image 106b.
[0175] In the case of the spotlight mode, the observation angle or squint angle in relation to a point stationary with respect to ground changes continuously, as a result of which likewise there is continuous change in the Doppler shift. The radar pulses emitted over the flight route are correlated accordingly by way of the expected Doppler shifts. A high resolution is obtained on account of the large distance covered and the long synthetic antenna array formed in this manner.
[0176] The production of an SAR image 106, both in the stripmap and in the spotlight mode, requires a very accurate speed value and/or position value, generated by the altitude detecting device 101b, the INS acquiring device 101c and/or the GNSS contained in the image capturing device 101. As a rule, the very accurate speed value may be generated by a GNSS system, in particular a GPS of the onboard navigation system 108. In the case of a GNSS outage, for example on account of a fault, it is necessary to use alternative navigation data, for example as are generated by the evaluation device 104.
[0177] The resolution in the flight direction is implemented by the observation of the Doppler shift of the received radar pulse and the comparison of the expected Doppler shift on account of the speed of the aircraft and the relative geometry between SAR antenna 101a′ and the observed surround, which is why a very accurate speed value is required.
[0179] An altitude value is added in an aircraft, and so the radar beam strikes objects to be observed substantially at an angle.
[0180] All that is depicted of the navigation apparatus 100 in
[0181] An SAR image 106b, in particular an SAR strip 106a, contains substantially no information about the depression angle 203, that is to say about the angle of incidence and angle of reflection of the reflected and received radar signal. Only the range is measured:
[0182] The received SAR pulse is read out at certain times relative to the emission time, each corresponding to a certain range. For the case depicted in
[0183] The terrain 202 is scanned at points a, b, c, e. Terrain point d cannot be scanned as it is shadowed by terrain point c. It is evident that the SAR image a′, b′, c′, d′, e′ does not correspond to the actual distances between the points a, b, c, e, or to the horizontal distances a*, b*, c*, d*, e*. A distortion arises. This distortion of the imaged distances relative to the actual distances between the points a, b, c, e, or their horizontal distances a*, b*, c*, e*, is referred to as foreshortening.
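The foreshortening described above can be reproduced numerically: the slant-range separation of two terrain points is compressed relative to their horizontal separation. The following sketch uses hypothetical geometry values (3000 m flight altitude, points 100 m apart) chosen purely for illustration:

```python
import math

def slant_range(h_antenna, x_ground, z_ground=0.0):
    """Slant range from an antenna at altitude h_antenna above the
    reference elevation to a terrain point at horizontal distance
    x_ground and elevation z_ground."""
    return math.hypot(x_ground, h_antenna - z_ground)

# Hypothetical geometry: two terrain points 100 m apart horizontally.
h = 3000.0
r_a = slant_range(h, 4000.0)   # point a
r_b = slant_range(h, 4100.0)   # point b
# The slant-range separation r_b - r_a imaged by the radar is smaller
# than the true 100 m horizontal separation: foreshortening.
```

In this example the two points appear roughly 80 m apart in the slant-range image although they are 100 m apart on the ground.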
[0184] However, this distortion makes a comparison with images difficult, the images for example having been recorded in the nadir direction and having been stored as a template 107.
[0185] As shown using the example of terrain points p, q and the images p′, q′ thereof, terrain points may also be inverted in an SAR image 106b, that is to say a terrain point p, which is located in front of another terrain point q actually in relation to the antenna 101a′, may, as image point p′, be located behind the image point q′ of the other terrain point in the image or in the range image 106a.
[0186] Since the SAR strip image 106a is distorted by the foreshortening, right up to the inversion of terrain points by the layover, it is difficult to interpret the raw data SAR strip image 106a without processing.
[0188] The resolution in the flight direction, that is to say the resolution along the speed vector v, known as the cross-range resolution, arises on account of the Doppler shift. The Doppler shift is calculated as follows:

f_D = −(2/λ)·(dr/dt)

[0189] where λ denotes the wavelength of the radar 101a and r describes the range between the SAR antenna 101a′ and the respective ground point a, b, c, d, e on the ground. Accordingly, dr/dt represents the rate of change of the range.

[0190] Under the assumption of an unaccelerated horizontal flight with a speed v and under the assumption that the point to be observed on the ground a, b, c, d, e is visible from the view of the SAR antenna 101a′ at a squint angle θ and a depression angle ε with respect to the speed vector v, the formula for the Doppler shift can be simplified as follows:

f_D = (2v/λ)·cos θ·cos ε
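The simplified Doppler relation f_D = (2v/λ)·cos θ·cos ε can be evaluated directly. The following is a minimal sketch; the function name and the example parameter values (200 m/s, 3 cm wavelength) are illustrative assumptions:

```python
import math

def doppler_shift(v, wavelength, squint_deg, depression_deg):
    """Simplified Doppler shift for unaccelerated horizontal flight:
    f_D = (2 v / wavelength) * cos(squint) * cos(depression)."""
    return (2.0 * v / wavelength) \
        * math.cos(math.radians(squint_deg)) \
        * math.cos(math.radians(depression_deg))
```

At zero squint and zero depression the full forward-motion shift 2v/λ is obtained; as either angle grows toward 90 degrees the shift falls toward zero.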
[0191] It is still necessary to consider the altitude error and its effect on the squint angle. The altitude error may arise since an assumed reference elevation 201 of the terrain 202 and/or of the terrain profile 202 is assumed for as long as the navigation information is unknown.
[0192] Under the assumption of a flat terrain, the SAR system 101 expects a Doppler shift of f_D,est on the basis of the speed v_est estimated by the INS acquiring device 101c or by the navigation system 108, the commanded squint angle θ_cmd, and the absolute altitude estimated by the altitude detecting device 101b and the estimated altitudes h_est above the ground points a, b, c, d, e derived therefrom:

f_D,est = (2·v_est/λ)·cos θ_cmd·cos ε_est, where sin ε_est = h_est/r
[0193] The squint angle is substantially independent of the orientation or alignment of the antenna 101a′. The orientation of the antenna is irrelevant to an SAR system 101, apart from the antenna gain on account of the antenna lobe, which influences the SNR (signal-to-noise ratio).
[0194] The SAR system 101 only processes those received radar signals 204 that have a Doppler shift of f_D,est.
[0195] An individual radar pulse is considered below for ease of understanding. The emitted radar pulse is reflected by many ground points (back-scattering) and thus returns back to the radar antenna 101a′. The in-range resolution is achieved by sampling the radar pulse at the set sampling times, according to the principle of the time-of-flight measurement. The time-of-flight condition is satisfied by all ground points that have the same distance from the SAR antenna 101a′. Graphically, this corresponds to a circular segment for a flat terrain. More information is required, specifically the Doppler shift, to select only a single ground point from this multiplicity of ground points, and hence to achieve the cross-range resolution. In principle, the Doppler shift can be chosen as desired. Should a certain ground point be intended to be observed or should a certain squint angle be desired, an appropriate Doppler shift can be calculated using the formula above and the SAR system 101 accordingly only processes these components of the received radar pulse.
[0196] On account of the error Δh in the altitude estimate, which comprises both the altitude error in respect of the own altitude and the elevation error in respect of the scanned ground point and the assumed reference elevation, however, a different ground point a, b, c, d, e with the parameters r, h_true, θ_true produces the expected Doppler shift f_D,true. Under the assumption that the estimated speed is error-free (v_est = v_true), the ground point (r, h_true, θ_true) thus generates the expected Doppler shift, with the range r or slant range r measured by the radar beam 204 remaining unchanged and the actual altitude difference being h_true = h_est + Δh. Thus, the actual squint angle θ_true = θ_cmd + Δθ differs from the commanded or set squint angle θ_cmd by the squint angle error Δθ:

cos θ_true·cos ε_true = cos θ_cmd·cos ε_est, with sin ε_true = h_true/r and sin ε_est = h_est/r
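Equating the commanded and true Doppler shifts at unchanged slant range and error-free speed gives the squint angle error numerically. This is a sketch under exactly those assumptions; the function name and the test geometry are illustrative:

```python
import math

def squint_error_from_altitude(r, h_est, dh, squint_cmd_deg):
    """Squint angle error (degrees) caused by an altitude error dh,
    with unchanged slant range r and error-free speed (v_est = v_true),
    obtained by equating commanded and true Doppler shifts:
    cos(theta_true) * cos(eps_true) = cos(theta_cmd) * cos(eps_est)."""
    eps_est = math.asin(h_est / r)            # estimated depression angle
    eps_true = math.asin((h_est + dh) / r)    # true depression angle
    cos_true = math.cos(math.radians(squint_cmd_deg)) \
        * math.cos(eps_est) / math.cos(eps_true)
    theta_true = math.degrees(math.acos(cos_true))
    return theta_true - squint_cmd_deg
```

With no altitude error the squint error vanishes; an underestimated altitude above the ground point (dh > 0) steepens the true depression angle and pulls the true squint angle below the commanded one.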
[0197] In addition to the altitude error Δh, it is also necessary to consider the speed error Δv, and its effect on the squint angle θ.
[0198] For the purposes of this speed error consideration, the influence of an error Δv in the absolute estimated speed is examined, such that v_true = v_est + Δv applies.
[0199] Once again, a different ground point (r, h_true, θ_true) to the actually expected ground point a, b, c, d, e generates the same expected Doppler shift f_D,true. The assumption is made that the elevation above the ground point a, b, c, d, e is error-free: h_est = h_true.
[0200] An error-free altitude measurement is assumed in this theoretical consideration in order to be able to mathematically describe the error components. The altitude should be understood to be the altitude above the ground point to be observed, which is why two items of altitude data are required. The one altitude value is the own altitude, which may be ascertained by a barometric altimeter, with a measurement error possibly being included. The other elevation value is the elevation of the ground point to be observed, which is usually unknown and/or only known approximately.
[0201] However, the height of the terrain point may be ascertainable by the templates 107 and, in particular, the landmark metadata YY contained therein.
[0202] The following arises:

v_true·cos θ_true = v_est·cos θ_cmd, that is to say cos θ_true = (v_est/(v_est + Δv))·cos θ_cmd
[0203] Should the estimated speed vector v_est be rotated in relation to the true speed vector v_true through an angle β in the horizontal plane, a squint angle error of Δθ = β is obtained.
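The speed-magnitude case can likewise be evaluated from the equal-Doppler relation v_true·cos θ_true = v_est·cos θ_cmd, assuming an error-free elevation. This sketch uses an illustrative function name and test values:

```python
import math

def squint_error_from_speed(v_est, dv, squint_cmd_deg):
    """Squint angle error (degrees) caused by a speed error dv,
    assuming an error-free elevation (h_est = h_true) and equal
    Doppler shifts: v_true * cos(theta_true) = v_est * cos(theta_cmd)."""
    cos_true = v_est / (v_est + dv) \
        * math.cos(math.radians(squint_cmd_deg))
    return math.degrees(math.acos(cos_true)) - squint_cmd_deg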
[0205] In state S401, the SAR system 101 generates an SAR strip image 106a, from which the SAR image 106b is calculated, in each case with the commanded squint angle θ_cmd and under the assumption of a flat terrain at the reference elevation h_terrain 201. Since there is no knowledge about the own position, or only insufficient accuracy in terms of the own position, the actual terrain elevation of the scanned terrain 202 at the respective ground point a, b, c, d, e is unknown. The actual terrain elevation can be taken into account if the own position is known with sufficient accuracy, by ascertaining the terrain elevation of the scanned terrain from a terrain database.
[0206] Image capturing metadata XX, that is to say the settings data applied to the SAR sensor 101a when capturing the image 106a, are also provided in state S402 together with the SAR image 106a, 106b. These settings data inter alia comprise the commanded squint angle, the range measurement, and the validity date of the data.
[0207] In state S403, the evaluation device 104 attempts within the scope of the image evaluation to correlate geo-referenced landscape features 501 contained in the database 102 and/or a geo-referenced landmark 501 with the SAR image 106b. The landscape features 501 contained in the database 102 are provided in state S404 in the form of templates 107 and landmark metadata YY. The landmark metadata YY inter alia comprise the geo-coordinates of the landscape feature, a description regarding the alignment and a resolution of the template or templates.
[0208] To be able to carry out the correlation, the “perspective distortions” of the SAR are modelled in the templates 107 within the scope of the preprocessing of the templates 107 in state S405. Consequently, the templates 107 and the SAR images 106b are normalized with respect to one another.
[0209] The “perspective distortions” of the SAR image 106b can firstly be imitated within the template 107 by a pose hypothesis. In this case, it may substantially be only the relative geometry and the altitude information explicitly or implicitly contained in the template that are taken into account in order to determine the distortion intended to be imitated in the template 107.
[0210] As an alternative to the pose hypotheses, the “perspective distortions” can be ascertained by attempting to retrieve a terrain profile 202 of the template 107 in the SAR image 106b, comparing the distortion of individual distances of the ground points a, b, c, d, e in reality and in the SAR image 106b, determining therefrom the distortion of the SAR image 106b caused by the radar measuring method, and transferring the latter to the template, thus imitating the distortions in the template.
[0211] Matching the SAR images 106b to fit the templates 107 would only be possible with a sufficiently accurate own position. Moreover, the adaptation and/or normalization of the SAR image 106b would entail great computational outlay; hence the provided templates 107 are usually normalized to the respective SAR image 106b and thereby artificially distorted.
[0212] By way of example, the templates 107 are chosen on the basis of a position estimate of the position by the navigation apparatus 100 and substantially contain geo-referenced landmarks 501 or geo-referenced landscape features 501 of a certain terrain section 202.
[0213] The normalization is possible since the position, and hence the terrain profile 202 around the landscape feature 501, and the terrain profile 202 of the landscape feature 501 and the relative geometry between SAR sensor 101a and landscape feature 501 are known.
[0214] The relative geometry is described by the absolute altitude of the aircraft, for example the barometric altitude, recorded by the altitude detection device 101b, the elevation of the landscape feature 501, by the line of sight of the SAR to the landscape feature and by the range r or slant range r to the landscape feature.
[0215] For a successful correlation, it is necessary that the terrain feature 501 is contained both in the selected template 107 and in the SAR image 106b, as a result of which it is possible to ascertain the minimum and maximum possible range of the landscape feature 501 on the basis of the settings data of the SAR.
[0216] To the extent the landscape feature 501 is present in the SAR image 106b, the assumption is made of a pose hypothesis that the terrain feature is situated in the “center” of the SAR image 106b such that the maximum range error is minimized. In particular, the pose hypothesis that the template 107 with the landscape feature 501 is located within the SAR image 106b is assumed.
[0217] The range r to the landscape feature 501 may be a fixed range of the SAR system 101 that is determinable by a radar beam 204. Each pixel of an SAR strip 106a can be assigned a range r. Thus, the SAR sensor 101a can be set in such a way that it supplies strips which have the same ranges, or it can be operated flexibly such that the ranges of the individual SAR strips 106a differ from one another.
[0218] The line of sight is determined by the course of the navigation apparatus 100, for example by the course of the inertial navigation system 101c or of a navigation filter taking account of the commanded squint angle of the SAR sensor 101a. By way of example, the course may be specified as an actual state by an aircraft 502 that uses the navigation apparatus 100.
[0219] By way of example, a navigation filter is a Kalman filter that fuses a plurality of sensor data. In an example, the navigation filter may fuse the data from an IMU and/or an INS, a barometric altimeter or laser altimeter and navigation data extracted from the SAR image.
[0220] The normalization and/or the distorting of the template 107 may be superfluous in an example, specifically if the correction has already been implemented on the part of the SAR system 101; however, this assumes precise knowledge of the own position. Should a GPS be present, the own position can be ascertained by GPS. However, the navigation apparatus 100 should be operational under conditions without GPS. In this case, the own position can be determined sufficiently accurately by the navigation apparatus 100 itself, together with the sensor fusion of IMU and/or INS and barometric altimeter.
[0221] The correlation between the SAR image 106b and the template 107, in particular the distorted or normalized template 107, is implemented in state S406. In this case, common features in the SAR image 106b and the template 107 which can be made congruent, for example the optical structure of a road crossing, are sought optically and using image processing methods. As a result, the template 107 can be pushed to the suitable position in the SAR image 106b, with the template usually covering a smaller landscape portion than the radar image 106b. The correlation may include a homographic transformation in an example.
[0222] Following the correlation of the template 107 with the SAR image 106b, the navigation information contained in the template 107 and/or the landmark meta information YY are transmitted to the navigation module 108 or the navigation system 108. In particular, the geo-referenced position of the landscape feature 501, the range r to the landscape feature 501, the commanded squint angle, the reference elevation 201 of the terrain 202 according to the SAR system 101, and the time of the SAR recording are transmitted to the navigation module 108. By way of example, these specifications can be derived from the image capturing metadata XX supplied with the image 106b and from the landmark metainformation YY.
[0223] The evaluation device 104 and/or the navigation module 108 corrects the squint angle in state S407 on the basis of the actual elevation of the landscape feature 501 in relation to the reference elevation 201 of the SAR system. Furthermore, the navigation module 108 in state S408 fuses the range and the squint angle in relation to the landscape feature 501 with the other sensor data, substantially those of the IMU and/or of the INS 101c and/or of the barometric altimeter 101b, and/or the navigation module 108 calculates the own position in an intermediate step with the aid of the altitude of the aircraft, before the own position is fused.
[0224] In sensor fusion, various sensor data are processed, for example in a Kalman filter, to form a common solution which combines the advantages of the individual sensors. Thus, an IMU and/or INS may have a low noise and a high update rate, but it is not long-term stable and drifts. In contrast thereto, a GNSS, a GPS or else the navigation apparatus 100 and/or the position determination method, for example, may be long-term stable but only supply measurements with a low update rate of approximately 0.05 Hz and may have a large amount of noise between two measurements. However, the drifting IMU and/or INS is substantially supported with the aid of backup sensors in the navigation filter in which the sensor fusion is performed, and hence this is (these are) also supported by the navigation apparatus 100.
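The fusion principle described above can be sketched in one dimension: a drifting, high-rate inertial position is supported by sparse, drift-free absolute fixes of the kind supplied by the landmark correlation. The class name, rates and noise values below are illustrative assumptions, not part of the navigation filter described here.

```python
# Minimal 1-D sketch of the sensor fusion described above: a drifting,
# high-rate INS position is supported by sparse, drift-free absolute fixes
# (e.g. from the SAR landmark correlation). All names, rates and noise
# values are illustrative assumptions.

class ScalarKalman:
    def __init__(self, x0, p0, q, r):
        self.x, self.p = x0, p0   # state estimate and its variance
        self.q, self.r = q, r     # process and measurement noise variances

    def predict(self, dx):
        """High-rate INS step: integrate the (drifting) INS increment."""
        self.x += dx
        self.p += self.q

    def update(self, z):
        """Low-rate absolute fix: blend the measurement by the Kalman gain."""
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= 1.0 - k

kf = ScalarKalman(x0=0.0, p0=1.0, q=0.5, r=4.0)
true_pos = 0.0
for step in range(200):
    true_pos += 1.0
    kf.predict(1.0 + 0.05)   # INS increment with a constant drift of 0.05
    if step % 20 == 19:      # absolute fix at 1/20 of the INS rate
        kf.update(true_pos)  # drift-free fix (noise omitted for determinism)
# Unsupported, the INS error would grow to 10 after 200 steps; the sparse
# fixes keep the fused error bounded well below that.
```

Whether the fix enters explicitly as a position or implicitly as a correction vector does not change the structure of such a filter.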
[0225] The fused solution now is available in the navigation system 108 and is transferred from the latter back to the image capturing device 101, which requires a very accurate speed value, via the connection 114. The transfer can be implemented either explicitly or implicitly. In the explicit case, the navigation solution provided by the navigation system 108 is used directly by the image capturing device 101. In the implicit case, the correction vector for the inertial navigation system, for example INS or IMU, is transferred to the INS acquiring device 101c in order thereby to independently correct the inertial navigation solution.
[0226] A navigation solution may contain a set of navigation data and/or navigation information, for example the bearing and/or speed and/or position.
[0227] Navigation information is an abstract term for any information serving navigation. By way of example, navigation information includes the altitude value of the barometric altimeter 101b or the squint angle in relation to a geo-referenced landmark 501 or the range in relation to a geo-referenced landmark 501. In particular, the navigation information may be raw sensor data.
[0228] Navigation filter refers to a filter which fuses a plurality of navigation data from one navigation sensor and/or from a plurality of navigation sensors to form a navigation solution. In particular, the data of an INS (inertial navigation system) 111 are fused with the data of a backup sensor device 112. By way of example, a GPS system can serve as a backup sensor device 112. In another example, the image capturing device 101, in particular the SAR sensor 101a or the SAR raw data acquiring device 101a and/or navigation data extracted from the SAR strip 106a, can also provide navigation data, in particular the image capturing metadata XX and the landmark metadata YY, for example the geo-referenced position of the landmark 501.
[0229]
[0230] To this end, the SAR image 106b is searched for known landscape features, for example from a template 107 and/or from a database 102 with geo-referenced landscape features 501. To limit the search, the search for matching templates 107 can be restricted to templates that are substantially situated in the surround expected from the surround scanned by the image capture. The expected scanned surround can be derived from the settings data XX or the image capturing metadata XX of the image capturing device 101, in particular the squint angle θ and the mapped range extent, and from the own position of the aircraft 502.
[0231] In the preprocessing step, which is substantially carried out by the image navigation device 104c, the distortion of the SAR image is modeled in the template 107 with the aid of a terrain profile 202 taken from the template 107, in order to simplify the correlation of SAR image 106b and template 107 and to arrive at the processed image 106c, in which the SAR image 106b and the template 107 are substantially overlaid on one another in such a way that they correspond at at least one common landmark 501. In this case, various optional optimizations are possible, for example the selection of the template 107 and/or the selection of an image processing method, for example an edge searching method.
[0232]
[0233] Hence, the SAR image 106b can be used for navigation purposes by virtue of the own position, for example the position of the aircraft, being ascertained following the overlay and congruence process. To this end, it is possible to derive data from the SAR image 106b and the required position data from the provided image capturing metadata XX, for example range, squint angle and time and/or time interval. The provision of the image capturing metadata XX can be implemented in an example by virtue of the latter being read from the SAR image 106b and/or being provided as image capturing metadata XX.
[0234] The result of the correlation is a pixel coordinate of the SAR image 106b, which corresponds to and/or is linked with the pixel coordinate of the geo-referenced landmark 501. Thus, proceeding from the landmark 501 in the template 107, a jump is made into the SAR image 106 at the position of the landmark 501 and the corresponding image capturing metadata XX of the pixel coordinate of the SAR image 106 are read.
[0235] Since the geo-coordinate of the landmark 501 is known from the template 107, a relationship can be established between the pixel coordinate of the landmark 501 in the SAR image 106b and the geo-coordinate of the geo-referenced landmark 501 in the template 107 by correlating and/or matching the template 107 to the SAR image 106b.
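The relationship established by the correlation can be sketched as follows; the function name, the offsets and the pixel values are purely illustrative.

```python
# Hypothetical sketch of the link established above: the correlation yields
# the offset at which the template 107 matches inside the SAR image 106b,
# which ties the geo-tagged pixel of the template to a pixel coordinate of
# the SAR image. Function name and values are illustrative.

def landmark_pixel_in_sar(geo_tag_xy, template_offset_xy):
    """Pixel coordinate of the geo-referenced landmark in the SAR image."""
    gx, gy = geo_tag_xy          # pixel of the landmark inside the template
    ox, oy = template_offset_xy  # position of the template in the SAR image
    return (ox + gx, oy + gy)

# Landmark at template pixel (12, 7), template matched at offset (230, 118):
px = landmark_pixel_in_sar((12, 7), (230, 118))
# This SAR pixel now carries the known geo-coordinate (lat, lon, alt) of the
# landmark from the landmark metadata YY.
```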
[0236] As a result of this knowledge, the image capturing metadata XX or the settings data XX of the SAR sensor 101a, for example the squint angle θ, and its measured variables stored in the SAR image 106b, for example the slant range r to the geo-referenced landmark 501 obtained by the radar range measurement, can be related to the geo-referenced landmark 501 in order thus to obtain navigation information in relation to the own position. The image capturing metadata XX substantially describe the relative geometry of the SAR antenna 101a in relation to the recorded terrain profile 202.
[0237] Further image capturing metadata XX, for example a speed or altitude value, can be provided by the SAR system's own measuring devices, such as the INS acquiring device 101c and the altitude detecting device 101b.
[0238] All data derived either in the SAR image 106b, from the image capturing metadata XX and/or from the landmark metadata YY can be fused to form a data pool, from which the navigation information then can be derived. In particular, the own position may be derivable from this data pool.
[0239] Additional data may be derived from the dedicated onboard navigation system 108 or the inertial navigation system 108.
[0240]
[0241] The aircraft 502, for example an airplane 502, and/or the radar sensor 101a move at the flight altitude h_AC (AC, aircraft), 810.
[0242] An SAR image 106b has distortions on account of the perspective and/or the terrain profile 202. Expressed differently, this means that distortions occur on account of the relative geometry of the radar sensor 101a in relation to the terrain profile 202.
[0243] The distortions in the SAR image 106b can only be corrected with difficulties since the accurate own position and hence the elevation of the scanned terrain 202 are generally unknown.
[0244] By contrast, the distortions of the SAR image 106b can be determined and reproduced in the template 107 since the position of the template 107 and the terrain of the template 107 and, approximately, the relative geometry between SAR sensor 101a or SAR raw data acquiring device 101a and template 107 are known.
[0245] The in-range shift in relation to the center pixel 701′ of the SAR image 106 is calculated below. In the process, the following assumptions and/or pose hypotheses are made: the geo-referenced point 701 of the landmark 501 of the template 107 is located centrally in relation to the in-range axis of the SAR image 106; the squint angle θ remains constant for the recording time period of the relevant portion of the SAR image 106 with the contained landscape feature 501; and the aircraft 502 carries out a substantially straight movement without altitude changes.
[0246] The pose hypothesis that the landscape feature 501 can be found in the center of the SAR image with respect to the in-range axis is optimal within the meaning of the maximum range error if only a single pose hypothesis is tested: in the worst case, the landscape feature 501 is situated at the lower or upper image edge, that is to say at the minimum or maximum range, and the maximum error consequently is half the difference between the maximum range and the minimum range. The range r has a significant influence on the relative geometry, and hence on the distortion. Therefore, the similarity of the distortion between the normalized template, under the assumption that the landscape feature 501 is situated in the center of the SAR image 106, and the SAR image is likewise optimal.
[0247] Should a plurality of hypotheses be tested, the hypotheses can be chosen so that the maximum error in respect of the range is minimized, but also that the maximum error in respect of the other relevant settings parameters is minimized at the same time.
[0248] The absolute value describes the distance in the SAR image plane 601 at flight altitude h_AC, 810 of the ground points “a” and “center” imaged in the SAR image plane. The ground points “a” and “center” are located on the terrain profile 202, with the ground point “a” having an elevation h_a. The ground point “center” corresponds to the center pixel 701′ of the SAR image 106b, that is to say is linked to the center pixel 701′, and is located at the landmark elevation h_center, 805. The link between the ground point “center” and the center pixel 701′ of the SAR image 106b is established in the case of a successful correlation of the template.
[0249] No connection between individual ground points a, b, c, d, e, center and pixel points a′, b′, c′, d′, e′, center′ of the SAR image 106b is established when determining the distortion by way of a pose hypothesis and an inclusion of the relative geometry; instead, all that is assumed is a distortion on account of the relative geometry.
[0250] In an alternative to testing the pose hypothesis, an attempt is made to assign to each ground point a, b, c, d, e, center of the terrain profile 202, which can be gathered from the database 102 as ground points a*, b*, c*, d*, e*, center* in the form of the template 107, a corresponding pixel point a′, b′, c′, d′, e′, center′ of the SAR image 106b in order to be able to determine the distortion between SAR image 106b and template.
[0251] In the case of the correlation that follows the distortion imitation or normalization by way of the terrain profile 202, it should be observed that the center pixel 701′ is situated in the SAR image 106b and, for example, can be connected to a corresponding geo-referenced pixel 701 of the landmark 501 in the template 107. The center pixel 701′ is assumed by the pose hypothesis that the landmark 501 known with its geo-coordinates and/or landmark metadata YY from the template 107 is located in the center of the SAR image 106b with respect to the in-range axis of the SAR image and will be found there.
[0252] Furthermore, for the normalization, the assumption is made that the absolute value is the horizontal range between the ground points center and a, projected onto the reference plane 602. The reference plane 602 is located at the assumed reference elevation h_ref, 603.
[0253] The shift s.sub.in in-range of a distance of a ground point a in relation to the center pixel 701′ when imaging a horizontal range in the reference plane onto the SAR image plane 601 is calculated as:
s.sub.in=(√{square root over (r.sub.center.sup.2−(h.sub.AC−h.sub.center).sup.2)}−√{square root over (r.sub.a.sup.2−(h.sub.AC−h.sub.a).sup.2)})−(r.sub.center−r.sub.a)
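The in-range shift formula above can be written out as a small function; the variable names follow the symbols of the equation, and the numeric values (in metres) are illustrative.

```python
import math

# Sketch of the in-range shift formula above; variable names follow the
# symbols of the equation, numeric values (metres) are illustrative.

def in_range_shift(r_center, r_a, h_ac, h_center, h_a):
    """s_in = (gr_center - gr_a) - (r_center - r_a)."""
    gr_center = math.sqrt(r_center**2 - (h_ac - h_center)**2)
    gr_a = math.sqrt(r_a**2 - (h_ac - h_a)**2)
    return (gr_center - gr_a) - (r_center - r_a)

# A ground point with the same slant range and elevation as the center pixel
# is not shifted:
s_zero = in_range_shift(4000.0, 4000.0, 2000.0, 100.0, 100.0)
# A closer, higher ground point a is displaced in range:
s = in_range_shift(4000.0, 3800.0, 2000.0, 100.0, 300.0)
```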
[0254] Consequently, image representations in the radar image 106 are described by these equations or, expressed differently, this is how the distortion existing between the template 107 and the radar image 106 on account of the relative geometry and the terrain profile is ascertained.
[0255] The template 107 is an entry in a reference database 102 and substantially represents an image with a landscape feature 501, for example an extracted road crossing.
[0256]
[0257] A representation of individual geo-referenced pixels of a template entry 107, for example of a map portion, is considered in
[0258] Thus, for example the center pixel 701′ may also relate to the SAR image 106b and be chosen in such a way that it is located in the center of the SAR image 106, while a geo-referenced pixel 701 relates to a geo-referenced landmark 501 in the template 107 and initially has no particular position and consequently need not necessarily be located in the center of the template 107.
[0259] However, a relationship and/or a link is established between the center pixel 701′ and the geo-referenced pixel of the geo-referenced landmark 501 as a result of the correlation.
[0260] A noteworthy piece of meta-information YY of the template 107 is the geo-referenced pixel geoTag_Ila 701 (latitude longitude altitude) and/or the geo-referenced point geoTag_Ila 701 of the template 107. A template 107 may also comprise a plurality of geo-referenced points geoTag_Ila 701. This referenced point geoTag_Ila 701 specifies the pixel coordinates geoTag_x and geoTag_y of the landmark 501 in the template 107. It is situated in an xy-coordinate system of the template 107 and can be chosen as desired. Unlike the center pixel 701′, for example, it is not distinguished by virtue of the fact that it is chosen to be in the center.
[0261] Expressed differently, the pose hypothesis is chosen freely and hence the center pixel 701′ need not be chosen to be in the image center. This may mean that a certain slant distance r to the landmark 501 is assumed for the purposes of calculating the relative geometry, in order to be able to estimate the distortion. This estimate may be carried out without considering the SAR image 106b. This slant distance r, in particular the pose of the landmark 501 in the SAR image, can be chosen between the extremal positions of the in-range extent at the lower image edge of the SAR image and at the upper image edge, at which the landmark could be found.
[0262] However, as a hypothesis, the assumption is made for the choice of the slant distance r that the landmark 501 is located in the center of the template 107 and hence a slant distance r which is located in the center between the two extremal values of the slant distance at the portion boundaries of the SAR image is chosen as a hypothesis.
[0263] On the basis of this assumption, it is possible to estimate the distortion intended to be imitated in the template substantially without having to resort to the SAR image 106b. How good the estimate and/or the assumption about the pose hypothesis was is then found during the correlation of the SAR image 106b and the template 107, in particular the landmark 501 of the template 107.
[0264] The assumption that the landmark 501, in particular the representation of the landmark 501 in the SAR image 106b, can be found in the center of the SAR image 106 in respect of the in-range axis is optimal when using only a pose hypothesis within the meaning of the maximum deviation of the similarity of the distortion between template and SAR image 106b and substantially serves to reduce the computational outlay. As an alternative to the assumption that the landmark 501 from the template 107, which should be found in the SAR image 106b, is located in the center of the SAR image 106, it is possible to make the assumption that the selected template 107 is located in the center of the SAR image 106b.
[0265] The abbreviation Ila represents the term “latitude longitude altitude”, that is to say the geo-referenced position of the corresponding pixel in the “real” world which may be represented by the template 107.
[0266] The abbreviations res_x_m and res_y_m describe the resolution of the template in the x-direction and y-direction, respectively, that is to say for example a scale and consequently the number of meters a pixel corresponds to in reality. By way of example, the resolution can be chosen as a resolution of 2 m. This information is implicitly contained in the geo-referenced position of the corners of the template 107. However, there are many equivalent options for representing the necessary information.
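As an illustration of the resolution metadata, a pixel offset in the template converts to a metric ground distance by multiplication with res_x_m and res_y_m; the function name is illustrative, and the values below use the 2 m example resolution mentioned above.

```python
# Illustrative use of the resolution metadata res_x_m and res_y_m: a pixel
# offset in the template converts to a metric ground distance. The 2 m value
# is the example resolution mentioned above; the function name is assumed.

res_x_m, res_y_m = 2.0, 2.0  # metres per pixel

def pixel_offset_to_metres(dx_px, dy_px):
    """Convert a template pixel offset to metres on the ground."""
    return dx_px * res_x_m, dy_px * res_y_m

# A 50-pixel offset in the x-direction corresponds to 100 m in reality:
dx_m, dy_m = pixel_offset_to_metres(50, 0)
```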
[0267] The metadata YY of the template 107 additionally required for preprocessing purposes are specified in
[0268]
[0269] Furthermore, the template contains the terrain elevation for all pixels or, alternatively, the terrain elevations of all pixels can be derived from a terrain database. In an example, only a few pixels may have a terrain elevation, and the terrain elevation of the other pixels can then be determined by interpolation from the elevation of the surrounding pixels. Alternatively, the elevation of the geo-referenced point 701 can be assumed for all points. These are various examples of how the landmark metadata YY can be determined.
[0270]
[0271] In this case, the geometry is shown between SAR sensor 101a, SAR image 106b and template 107. Together with the aircraft 502, the sensor 101a is at the flight altitude h_AC 810, the ground point imaged in the center pixel 701′ is situated at the assumed elevation h_center 805, such that the latter corresponds to the landmark elevation 805. Further, the sensor 101a is situated at a slant range of r_center 808 and a ground range of gr_center 804 in relation to the ground point of the landmark 501 imaged in the center pixel 701′. The absolute squint angle Ψ+θ is spanned between the north direction 806 and the slant range r_center 808. In this case, the course angle Ψ denotes the course in relation to the north direction 806 and may be assumed as a negative angle in
[0272] By way of example, the SAR sensor 101a moves in the aircraft 502 in the direction 807 denoted by v or along the course denoted by the direction 807. In relation to the north direction 806, the SAR sensor 101a moves with the course angle Ψ. The course angle Ψ is specified in relation to north 806. The squint angle θ is specified in relation to the speed vector v. Therefore, the sum Ψ+θ of course angle Ψ and squint angle θ represents the squint angle in relation to north 806. The direction 807 and the north direction 806 run substantially perpendicular to the absolute altitude difference 809 h_AC−h_center.
[0273] The SAR sensor 101a records the SAR image 106b, 106a, which has an extent in the movement direction 802, that is to say in the cross-range direction 802, and an extent in a direction 801 substantially at right angles to the movement direction 802, that is to say in the in-range direction 801.
[0274] That slant range 808 r_center from the SAR sensor 101a to the center line 803, which is measured by the radar beam 204, is constant in relation to the respective recording times of the SAR strips 106a, with the strip map mode having been selected and the assumption of a flight in a straight line at a constant altitude and a fixed SAR range to the center line 803 applying. Likewise, the range 804 gr_center from the reference plane to the center line 803 is constant.
[0275] Normally, the SAR system 101 records all SAR strips 106a using a fixed relative geometry. In one example, the minimum range, that is to say the lowermost pixel of the SAR strip 106a and/or the lowermost pixel of the SAR image 106b, may be 3 km and the maximum range, that is to say the uppermost pixel of the SAR strip and/or the uppermost pixel of the SAR image 106b, may be 5 km. This yields an in-range 801 extent of 2 km for the SAR strip 106a and/or the SAR image 106b.
[0276] Consequently, the central pixel 701′ or the center pixel 701′ of the SAR strip 106a and/or of the SAR image 106b is at a range of 4 km from the aircraft 502 at the landmark elevation h_center, 805, and a center line 803, which is likewise at a range of 4 km, arises accordingly.
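The numbers of this example can be written out directly; the 3 km and 5 km values are those mentioned above, and the maximum range error follows the worst-case consideration of the single pose hypothesis (half the difference between maximum and minimum range).

```python
# Worked numbers from the example above: minimum range 3 km, maximum range
# 5 km; the variable names are illustrative.

r_min, r_max = 3000.0, 5000.0            # metres
in_range_extent = r_max - r_min          # extent of the SAR strip in range
r_center = 0.5 * (r_min + r_max)         # range of the center pixel 701'
max_range_error = 0.5 * in_range_extent  # worst case of the pose hypothesis
```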
[0277] A similar procedure can be carried out for an SAR image 106b that was recorded by the spotlight mode.
[0278] The reference plane 602 (not shown in
gr.sub.center=√{square root over (r.sub.center.sup.2−(h.sub.AC−h.sub.center).sup.2)}
[0279] To simplify the correlation, the assumption is made that the center line 803 contains the geo-referenced point 701 of the template 107, and hence the geo-referenced landmark 501.
[0280]
[0281] With the assumption of a flight in a straight line along the movement direction 807 at a constant altitude h_AC (not shown in
[0282] To this end, the horizontal range gr.sub.i, j 901 of each pixel of the template 107, which surrounds the geo-referenced pixel 701 of the landmark 501 and which is contained in the template 107, is calculated. The horizontal range gr.sub.i, j 901 runs parallel to gr.sub.center 804 and ends at the flight track 807′ projected into the horizontal plane.
[0283] The slant range r.sub.i, j of each pixel of the template 107 is calculated according to the formula
r.sub.i,j=√{square root over (gr.sub.i,j.sup.2+(h.sub.AC−h.sub.i,j).sup.2)}
[0284] The in-range shift 902 s.sub.in_i,j=(gr.sub.i,j−gr.sub.center)−(r.sub.i,j−r.sub.center) is calculated for each pixel of the template 107.
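The per-pixel calculation of the slant range and the in-range shift can be sketched over a small grid; the function name, the grid values, the flight altitude and the center geometry are illustrative assumptions.

```python
import math

# Sketch of the per-pixel normalization: for each template pixel, the slant
# range r_ij and the in-range shift s_in_ij are computed from the horizontal
# range gr_ij and the terrain elevation h_ij. Grid values, flight altitude
# and center geometry are illustrative assumptions.

def per_pixel_in_range_shifts(gr, h, h_ac, gr_center, h_center):
    """s_in[i][j] = (gr[i][j] - gr_center) - (r[i][j] - r_center)."""
    r_center = math.sqrt(gr_center**2 + (h_ac - h_center)**2)
    shifts = []
    for gr_row, h_row in zip(gr, h):
        row = []
        for gr_ij, h_ij in zip(gr_row, h_row):
            r_ij = math.sqrt(gr_ij**2 + (h_ac - h_ij)**2)
            row.append((gr_ij - gr_center) - (r_ij - r_center))
        shifts.append(row)
    return shifts

# 2x2 toy template around the geo-referenced pixel:
gr = [[3500.0, 3520.0], [3500.0, 3520.0]]  # horizontal ranges in metres
h = [[100.0, 100.0], [150.0, 150.0]]       # terrain elevations in metres
s = per_pixel_in_range_shifts(gr, h, h_ac=2000.0, gr_center=3500.0,
                              h_center=100.0)
# s[0][0] is zero: this pixel coincides with the assumed center geometry.
```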
[0285] In a manner analogous to the in-range shift, it is possible to calculate and apply the cross-range shift. The cross-range shift 903 s.sub.cross in relation to the geo-referenced pixel 701 of the landmark 501 is calculated as follows:
[0286] For the purposes of normalizing and taking account of the distortion, the template 107 is initially rotated through the squint angle correction for the center pixel 701′, that is to say the geo-referenced point 701 of the landmark 501.
[0287] The shifts s.sub.in_i and s.sub.cross_i are carried out by virtue of the template 107 being resampled taking into account the shifts, for example by virtue of the value of the pixel being assigned to the closest pixel or being assigned in proportions to the adjacent pixels. In this case, this is an inverse operation to the bilinear interpolation.
[0288] Consequently, the distortion from the SAR image 106b to be applied to the template 107 is achieved by the shift s.sub.in_i and s.sub.cross_i for each pixel. Since the resolution of an image is discrete, that is to say comprises only an integer number of pixels, but the shift might not be integer, the shifted pixel falls between the original pixel raster. The template 107 is resampled within this meaning. As a result of the shift, color values and/or intensity values are reassigned to the existing pixel raster and/or resampled accordingly, for example by assignment to the closest pixel or by an inverse operation to the bilinear interpolation.
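The nearest-pixel variant of the resampling can be sketched in one dimension; the function name and the shift values are illustrative, and colliding pixels simply overwrite one another here, whereas a proportional (inverse-bilinear) assignment would distribute the value over adjacent cells.

```python
# Minimal 1-D sketch of nearest-pixel resampling: each pixel value is moved
# by its (possibly fractional) shift and assigned to the closest cell of a
# fresh raster. Shift values are illustrative; colliding pixels simply
# overwrite here rather than being distributed proportionally.

def resample_nearest(values, shifts, size):
    """Shift 1-D pixel values by fractional amounts onto a new raster."""
    out = [0.0] * size
    for i, (v, s) in enumerate(zip(values, shifts)):
        j = round(i + s)       # closest pixel of the new raster
        if 0 <= j < size:
            out[j] = v
    return out

row = [10.0, 20.0, 30.0, 40.0]
shifted = resample_nearest(row, shifts=[0.0, 0.6, 0.6, -0.4], size=4)
# Pixels 1 and 2 move to raster cells 2 and 3; pixel 3 also rounds to
# cell 3 and overwrites it, leaving cell 1 empty.
```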
[0289] If the assumptions made, that is to say a flight in a straight line, a constant flight altitude and/or the assumption that the geo-referenced point 701 of the template 107 is located in the center of the SAR image 106b in relation to the in-range coordinate, are not applicable, the method can nevertheless be applied approximately in order to imitate in the template 107 the distortion of the SAR image that is caused by the terrain profile 202 and by the relative geometry.
[0290] In particular, instead of the assumption that the geo-referenced point 701 of the template 107, that is to say the geo-referenced point 701 of the landmark 501 in the template 107, is located in the center of the SAR image 106b, it is possible to use the estimated position of the aircraft and/or of the SAR sensor 101a in order to obtain the relative geometry between SAR sensor 101a and template 107 with an improved accuracy, in order thus to better imitate and/or model the distortion of the SAR image 106b in the template 107. Additionally, any other point of the template 107 can be chosen instead of the geo-referenced point 701 of the landmark 501 as a reference for calculating the shift or the distortion.
[0291] Furthermore, following a successful correlation and/or a successful matching of the template 107 and SAR image 106b, 106c, the distortion of the template 107 can now be implemented again with the now available and very accurate relative geometry. This may lead to a good correlation process and ultimately lead to very accurate navigation information.
[0292] In accordance with the basic concept that the position and the terrain profile 202 of the template 107 are substantially known, it is possible to model the distortions in the SAR image 106b in the template 107 even without the assumptions made above, even if this is accompanied by increased computing power and increased complexity, by virtue of varying the influencing variables, that is to say testing a plurality of hypotheses.
[0293] In other words, this may mean that should the distortion of the SAR image 106b not be modeled in the template 107, this may make the correlation process and/or the matching all the more difficult, so that the result is substantially inaccurate or the correlation process even fails. The application of the distortion to the template 107 may use prior knowledge, for example the terrain profile 202 of the template 107 and/or corresponding landmark metadata YY, but also assumptions, for example the position at which the template 107 is found in the SAR image 106b. What can be achieved if these assumptions are made before the correlation process is carried out is that the distortion of the SAR image is modeled well in the template 107.
[0294] The relative geometry may be known with good accuracy since the terrain profile 202 of the template 107 is substantially known, for example by way of the landmark metadata YY, which are also stored for the corresponding terrain points, and no assumptions need to be made in this respect. In addition to the perspective, the terrain profile 202 has a great influence on the distortion in the SAR image 106a, 106b. The distortion derived with a great reliability from the relative geometry and/or the terrain profile 202 may therefore be a good approximation to the actual but unknown distortion of the SAR image 106b. Expressed differently, this may mean that the fact that the SAR image 106b has a distortion as a result of the slanted line of sight of the SAR raw data acquiring device 101a is known, but the distortion itself is not. The theoretical extremal values would be looking exactly perpendicular and/or in the nadir direction at terrain points a, b, c, d, e, or looking at these exactly laterally, that is to say at a 90° angle. In fact, the extent of range, that is to say the range between the imaged ground points with a minimum and maximum slant range, of an SAR strip 106a or of the SAR image 106b is known from the settings data XX. Every line of sight and consequently every distortion is possible therebetween.
[0295] So as not to try out all possible distortions, an assumption applicable on average and/or a pose hypothesis is made, namely that the landmark 501 is located in the center of the SAR image 106b in respect of the in-range axis 801, that is to say on the center line 803. This assumed approximate mean distortion, applied to the template 107, reflects the actual distortion of the SAR image 106b better than the use of an unprocessed, non-preprocessed and unnormalized template 107 as taken from the template database 102. Hence, the correlation process may be simplified and the accuracy may be increased if preprocessing and/or the distortion of the SAR image 106b is applied to the template 107.
[0296] After a landscape feature 501 of the database 102 has been found or correlated in the SAR image 106b, it is possible to link and/or fuse the landmark metadata YY and the image capturing metadata XX, including the measured variables of the SAR image 106b, in particular the slant range r, the metadata serving to determine the own position. In particular, the range and the squint angle of the SAR antenna 101a′ in relation to the geo-referenced landscape feature 501 are known, together with the associated validity datum.
[0297] Following the correlation of the template 107 or the landscape feature 501 with the SAR image 106b, the pixel coordinates of the SAR image 106b which correspond to the at least one geo-referenced point or the at least one geo-referenced pixel 701 of the landscape feature 501 are known. From this information it is possible to ascertain the SAR strip 106a which contains the geo-referenced landmark 501. Hence, the commanded squint angle θ is known from the image capturing metadata XX of the SAR strip 106a linked therewith, that is to say a target squint angle which describes the line of sight with which the geo-referenced landmark 501 is seen before any correction of the squint angle has been undertaken.
[0298] Following the correlation, the actual pixel coordinate of the landmark 501 and hence also the actual geo-coordinate of the end point of the actual slant range r is known, thus replacing the previously assumed slant range r_center with an actual value. However, in general, the actual pixel coordinate deviates from the assumption and/or pose hypothesis of being located in the center of the SAR image 106b. The center pixel 701′, like the slant range r_center, was only an assumption for normalizing the template and now plays no further role since the true geo-coordinates and the true slant range r of the landmark 501 in the SAR image 106b are known.
[0299] Furthermore, the validity datum, that is to say the recording time and/or the recording time interval of the SAR strip 106a, is known. This validity datum describes when the physical measurement of the SAR strip 106a took place and is denoted by @t.
[0300] Further, the metadata XX of the SAR strip 106a contain the measured slant range for each pixel of the SAR strip 106a, and hence the slant range r of the geo-referenced landscape feature 501 from the SAR antenna 101a′ is also known.
[0301] Therefore, the navigation apparatus 100 has a set of navigation information available, for example the slant range r and the squint angle θ of the SAR antenna 101a′ in relation to the geo-referenced landscape feature 501 with the validity datum @t.
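The set of navigation information described above (slant range r, squint angle θ, validity datum @t) can be sketched as a small data structure. The following is a minimal, hypothetical Python illustration; the per-pixel slant-range array, the correlated pixel coordinate, and all names are assumptions, not part of the disclosed apparatus:

```python
from dataclasses import dataclass

@dataclass
class SarNavigationInfo:
    """One measurement set for fusion: slant range r, squint angle theta,
    and the validity datum @t of the SAR strip (names are illustrative)."""
    slant_range_m: float
    squint_angle_rad: float
    validity_time_s: float

def navigation_info_for_landmark(slant_ranges, row, col,
                                 squint_angle_rad, validity_time_s):
    """Look up the measured slant range at the correlated pixel coordinate
    (the metadata XX contain a slant range for each pixel of the SAR strip)
    and bundle it with the strip's squint angle and validity datum."""
    r = slant_ranges[row][col]
    return SarNavigationInfo(r, squint_angle_rad, validity_time_s)
```

This merely bundles the three quantities that, per paragraph [0301], are jointly available to the navigation apparatus 100 after correlation.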
[0302] Since the creation of the SAR images 106b repeats periodically, the image data derived from the SAR image 106b is substantially a snapshot, that is to say the state at the recording time of an individual SAR strip 106a. Each SAR strip 106a is assigned to a validity datum, the validity datum describing when the physical measurement was carried out.
[0303] For the stripmap mode, the SAR image 106b comprises a plurality of SAR strips 106a, which each have different validity datums, and the state variables, for example the position, the course, etc., of the aircraft 502 were accordingly different at different times.
[0304]
[0305] Before the squint angle θ is fused in the navigation filter, it is first corrected in order to increase the navigation accuracy.
[0306] As already shown, an altitude error Δh may cause an error in the squint angle Δθ. The squint angle error Δθ describes the difference between the actual squint angle θ, at which the landscape feature 501 was in fact recorded, and the expected squint angle θ.sub.cmd or the target squint angle, at which the landscape feature was recorded according to the SAR system 101. However, the actual squint angle θ is not known to the SAR system 101 or the image capturing device 101.
[0307] The altitude error Δh is composed of the altitude error of the own position and the difference between the terrain elevation assumed by the SAR system 101 and the actual terrain elevation of the geo-referenced point. For simplification purposes, a flat terrain is the starting point for the assumed terrain elevation. The altitude error of the own position is not known.
[0308] The error on account of the difference in the terrain height can be approached in at least two different ways.
[0309] In an example, the altitude error Δh may be determined on account of the assumption about the terrain elevation since the elevation of the landscape feature 501 and the assumption about the terrain elevation are known.
[0310] Accordingly, the squint angle error Δθ can be calculated and corrected according to the equation already specified above.
[0311] In another example, the SAR system 101 makes the assumption that the terrain elevation corresponds to the elevation h_center, 805 of the geo-referenced landscape feature 501. The prediction and/or hypothesis as to which landscape feature 501 should be expected in the SAR image 106b can also be made with only an approximate own position. As a result, the component of the altitude error Δh on account of the difference between the assumed elevation h_center, 805 and the actual elevation of the landscape feature is eliminated. If a plurality of landscape features 501 are present in an SAR image 106b, this principle may not be usable, or may only be usable to a limited extent; in that case, an average elevation of the landscape features can, for example, be chosen for the purposes of reducing the error.
[0312] The SAR sensor 101a should select a reference elevation h_ref, 603 in order to calculate the expected Doppler shift. In general, the SAR sensor 101a may use any desired reference elevation h_ref, 603. If the reference elevation h_ref, 603 is skillfully chosen, that is to say if it corresponds to the elevation h_center of the geo-referenced landscape feature 501 of the template 107, it is possible to dispense with the subsequent squint angle correction on account of the non-existent altitude error between the reference elevation and the elevation of the template.
[0313] The reference elevation h_ref, 603 may in principle be chosen arbitrarily. However, it is typically chosen in such a way that it corresponds to the actually scanned terrain to the best possible extent.
[0314] Should the reference elevation h_ref, 603 correspond to the elevation of the geo-referenced landscape feature 501 of the template 107, the altitude error between the geo-referenced feature and the reference elevation is zero and accordingly the squint angle error in relation to the geo-referenced feature is likewise zero. That is to say there is no further need for a correction of the squint angle.
[0315] In the case where a plurality of templates 107 or a plurality of geo-referenced landscape features 501 are situated in one SAR image 106b, it is no longer possible to select the reference elevation h_ref, 603 skillfully, because the reference elevation can correspond to the elevation of one landscape feature 501 or of another, but not to two or more landscape features 501 at the same time.
[0316] With this knowledge about possible errors it may be possible to develop an error model of the navigation process.
[0317] In this case, the influence of the state variables should be considered. For example, the position has substantially no influence on the SAR image 106b. All that is required is an approximate position in order to select the possible templates 107.
[0318] The SAR system 101 requires a very precise speed value in order to calculate the expected Doppler shift, with the Doppler shift otherwise contributing to the squint angle error Δθ. However, the component of the squint angle error Δθ on account of the relative altitude error between the SAR antenna and the surround imaged in the SAR image can be corrected on account of the very accurate elevation value of the geo-referenced landmark.
[0319] The bearing angles have substantially no direct influence on the SAR image 106b. However, the bearing influences the antenna gain in the direction of the terrain to be scanned, and hence the sensitivity of the SAR sensor 101a; this in turn has an influence on the image quality.
[0320] There are various influencing factors which influence the accuracy of the SAR navigation information and, in particular, the accuracy of the image capturing metadata XX.
[0321] Thus, in the case of a range measurement, the range accuracy σ.sub.r is influenced by the measurement accuracy of the SAR sensor 101a in respect of the range, in particular the accuracy of the time-of-flight measurement, σ.sub.SAR,range, by the accuracy of the correlation process σ.sub.correlation process, in particular the accuracy of the correlation process in the horizontal plane, and by the accuracy of the geo-referenced point 501, for example as a result of a database error σ.sub.database.
[0322] Since the individual error components are independent and using the assumption that the errors have a normal distribution, the resultant range accuracy σ.sub.r can be calculated as follows:
σ.sub.r=√{square root over (σ.sub.SAR,range.sup.2+σ.sub.correlation process.sup.2+σ.sub.database.sup.2)}
[0323] Angle measurements have a significant influence on the result when using an SAR system 101.
[0324] The angle accuracy σ.sub.θ of the squint angle θ may be influenced by the accuracy of the SAR sensor 101a in respect of the angular resolution, for example on account of the Doppler shift, σ.sub.SAR,angle. However, the angle accuracy σ.sub.θ of the squint angle θ may also be influenced by the accuracy of the correlation process σ.sub.correlation process, in particular the accuracy of the correlation process in the horizontal plane, and by the accuracy of the geo-referenced point 501, for example as a result of a database error σ.sub.database. Likewise, the angle accuracy σ.sub.θ of the squint angle may be influenced by the altitude error σ.sub.h remaining after the angle correction, the altitude error containing the altitude accuracy of the own position and the elevation accuracy of the geo-referenced point, by the accuracy of the absolute speed σ.sub.Δv, and by the direction error of the speed vector σ.sub.β.
[0325] Since the individual error components are independent and using the assumption that the errors have a normal distribution, the resultant angle accuracy σ.sub.θ can be calculated substantially as the square root of the sum of the squares of the individual error components.
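The root-sum-of-squares combination of independent, normally distributed error components used for both σ.sub.r and σ.sub.θ can be sketched as follows. This is a minimal Python illustration with invented example values; for σ.sub.θ it assumes each component has already been expressed as an equivalent angle error, which the disclosure does not spell out:

```python
import math

def rss(*sigmas):
    """Root sum of squares of independent, normally distributed error components."""
    return math.sqrt(sum(s * s for s in sigmas))

# Range accuracy sigma_r from the sensor time-of-flight, correlation-process
# and database components (illustrative values, all in meters):
sigma_r = rss(3.0, 4.0, 0.0)  # -> 5.0

# Angle accuracy sigma_theta, assuming each component (sensor, correlation,
# database, residual altitude, speed magnitude, speed direction) has already
# been converted to an equivalent angle error in radians (illustrative values):
sigma_theta = rss(1.0e-3, 0.5e-3, 0.2e-3, 0.3e-3, 0.1e-3, 0.1e-3)
```

The same helper applies to any number of independent components, which is why both the range and the angle accuracy follow the same pattern.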
[0326] The navigation filter can carry out tightly-coupled fusion directly on the ascertained navigation information, for example the range and the squint angle in relation to the geo-referenced point 501. To this end, the accuracies of the navigation information are also required.
[0327] Expressed differently, tightly coupled may mean that the sensor measurements are closer to the measurement principle of the sensor but are no longer present in the same state domain in which the navigation filter operates.
[0328] Using the example of GPS, this corresponds to the processing of the range measurements to the individual GPS satellites. In this case, the measurements lie in the range domain rather than in the position domain.
[0329]
[0330] In this alternative example, the ascertained range and the ascertained squint angle are transformed into Cartesian position information before they are processed in loosely coupled fashion in the navigation filter, as a result of which there is an increased robustness of the navigation filter.
[0331] In this case, loosely coupled may mean that the sensor measurements are present in the same state domain in which the navigation filter operates. Using the example of GPS, a loosely coupled fusion is based on fusion of the GPS position.
[0332] The ground range is plotted on the abscissa 1001 and the altitude is plotted on the ordinate 1002. Proceeding from the aircraft 502, which is traveling substantially into the plane of the drawing, the radar beam 204, 1003 is emitted in the direction of the geo-referenced landmark 501. In this case, the beam travels the slant range r. This corresponds to a ground component 1004, gr, which can be calculated. The ground component 1004, gr lies at a reference elevation 1005, which corresponds to the elevation h_center, 805 of the geo-referenced landmark 501. The aircraft 502 is at an altitude 809, 1006 above the reference elevation 1005. By way of example, the absolute altitude of the aircraft h_AC, 810 was ascertained by way of a barometric altimeter.
[0333]
[0334] The east direction 1101, e is plotted on the abscissa and the north direction 1102, n is plotted on the ordinate. Only the ground component 1004 gr of the radar beam 1003 is represented.
[0335] Proceeding from the aircraft 502, which moves along the flight direction 1103 in the northwesterly direction, the radar beam 1003 is emitted in the direction of the geo-referenced landscape feature 501.
[0336] The angle between the flight direction 1103 and the radar beam 1003, in particular its ground component 1004, equals the corrected squint angle θ.sub.corrected. The ground component 1004 can be divided into a delta-north component 1004n or Δnorth 1004n and a delta-east component 1004e or Δeast 1004e on the basis of the corrected squint angle θ.sub.corrected.
[0337] For loose coupling, the horizontal range 1004, that is to say the ground range 1004, gr, to the geo-referenced point 501 is calculated with the aid of the altitude 1006 and the horizontal range 1004 is divided into a north component and an east component with the aid of the corrected squint angle θ.sub.corrected.
[0338] The altitude can be ascertained autonomously in an example, for example by way of a barometric altimeter 101b.
[0339] The ground range 1004 emerges as
gr=√{square root over (r.sup.2−(h.sub.AC−h.sub.center).sup.2)}
[0340] In this case, h_center, 805 is the elevation of the geo-referenced point on the terrain profile 202 which can be ascertained from the landmark metadata YY of the template 107.
[0341] With the corrected squint angle θ.sub.corrected, the delta-north component 1004n emerges as
Δnorth=cos(θ.sub.corrected)gr.
[0342] With the corrected squint angle θ.sub.corrected, the delta-east component 1004e emerges as
Δeast=sin(θ.sub.corrected)gr.
[0343] Since this is two-dimensional position information, it is divided into a north component 1102 and an east component 1101. However, it is possible to define any orthogonal axes, for example south and west.
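The projection and decomposition in paragraphs [0339] to [0342] can be sketched as follows. This is a minimal Python illustration under the document's convention Δnorth = cos(θ.sub.corrected)·gr and Δeast = sin(θ.sub.corrected)·gr; function and variable names are assumptions:

```python
import math

def loose_coupling_offsets(r, h_ac, h_center, theta_corrected):
    """Project the slant range r to the ground range gr at the landmark
    elevation h_center, then split gr into delta-north and delta-east
    components with the corrected squint angle:
      gr      = sqrt(r^2 - (h_ac - h_center)^2)
      d_north = cos(theta_corrected) * gr
      d_east  = sin(theta_corrected) * gr
    """
    gr = math.sqrt(r**2 - (h_ac - h_center)**2)
    delta_north = math.cos(theta_corrected) * gr
    delta_east = math.sin(theta_corrected) * gr
    return gr, delta_north, delta_east

# Example: slant range 5000 m, aircraft 3000 m above the landmark elevation,
# corrected squint angle 0 rad -> gr = 4000.0 m, all of it in the north direction
gr, dn, de = loose_coupling_offsets(5000.0, 3000.0, 0.0, 0.0)
```

With a nonzero corrected squint angle, the same call distributes gr between the north and east components as described in paragraphs [0341] and [0342].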
[0344] The accuracy of the navigation information required for loosely coupled sensor fusion is described below. The accuracy of the ground range follows by first-order error propagation from gr=√(r²−(h.sub.AC−h.sub.center)²):
σ.sub.gr=√{square root over ((r/gr σ.sub.r).sup.2+((h.sub.AC−h.sub.center)/gr σ.sub.h).sup.2)}
The accuracy of the east component is
σ.sub.east=√{square root over ((sin(θ.sub.corrected)σ.sub.gr).sup.2+(cos(θ.sub.corrected)grσ.sub.θ).sup.2)}
and, correspondingly, the accuracy of the north component is
σ.sub.north=√{square root over ((cos(θ.sub.corrected)σ.sub.gr).sup.2+(sin(θ.sub.corrected)grσ.sub.θ).sup.2)}
where σ.sub.r denotes the range accuracy and σ.sub.θ denotes the angle accuracy. Since the range accuracy σ.sub.r already contains the elevation error in the database, σ.sub.h describes only the altitude error of the own position.
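This error propagation can be sketched numerically. The following Python illustration applies standard first-order propagation: σ.sub.gr is derived from the partial derivatives of gr with respect to r and the altitude difference, and σ.sub.north is the counterpart of the σ.sub.east expression with sine and cosine exchanged; both of these forms are assumptions consistent with the surrounding formulas, not quoted from the disclosure:

```python
import math

def horizontal_accuracy(r, h_ac, h_center, theta_corrected,
                        sigma_r, sigma_h, sigma_theta):
    """First-order propagation of range, altitude and angle accuracies into
    east/north position accuracies for loosely coupled fusion."""
    dh = h_ac - h_center
    gr = math.sqrt(r**2 - dh**2)
    # gr = sqrt(r^2 - dh^2)  =>  d(gr)/dr = r/gr, d(gr)/d(dh) = -dh/gr
    sigma_gr = math.sqrt((r / gr * sigma_r)**2 + (dh / gr * sigma_h)**2)
    sigma_east = math.sqrt((math.sin(theta_corrected) * sigma_gr)**2
                           + (math.cos(theta_corrected) * gr * sigma_theta)**2)
    sigma_north = math.sqrt((math.cos(theta_corrected) * sigma_gr)**2
                            + (math.sin(theta_corrected) * gr * sigma_theta)**2)
    return sigma_east, sigma_north

# Example with illustrative values: r = 5000 m, 3000 m above the landmark,
# corrected squint angle 0 rad, sigma_r = 4 m, sigma_h = 3 m, sigma_theta = 1 mrad
se, sn = horizontal_accuracy(5000.0, 3000.0, 0.0, 0.0, 4.0, 3.0, 1.0e-3)
```

At θ.sub.corrected = 0 the east accuracy is driven entirely by the angle error (gr·σ.sub.θ) and the north accuracy entirely by σ.sub.gr, which matches the geometric intuition of the decomposition.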
[0345] It is additionally pointed out that “comprising” or “having” does not rule out other elements or steps, and “a” or “an” do not rule out a multiplicity. It is also pointed out that features or steps that have been described with reference to one of the above exemplary embodiments may also be used in combination with other features or steps of other exemplary embodiments described above. Reference signs in the claims are not to be regarded as limiting.
[0346] The subject matter disclosed herein can be implemented in or with software in combination with hardware and/or firmware. For example, the subject matter described herein can be implemented in or with software executed by a processor or processing unit. In one exemplary implementation, the subject matter described herein can be implemented using a computer readable medium having stored thereon computer executable instructions that when executed by a processor of a computer control the computer to perform steps. Exemplary computer readable mediums suitable for implementing the subject matter described herein include non-transitory devices, such as disk memory devices, chip memory devices, programmable logic devices, and application specific integrated circuits. In addition, a computer readable medium that implements the subject matter described herein can be located on a single device or computing platform or can be distributed across multiple devices or computing platforms.
[0347] While at least one exemplary embodiment of the present invention(s) is disclosed herein, it should be understood that modifications, substitutions and alternatives may be apparent to one of ordinary skill in the art and can be made without departing from the scope of this disclosure. This disclosure is intended to cover any adaptations or variations of the exemplary embodiment(s). In addition, in this disclosure, the terms “comprise” or “comprising” do not exclude other elements or steps, the terms “a”, “an” or “one” do not exclude a plural number, and the term “or” means either or both. Furthermore, characteristics or steps which have been described may also be used in combination with other characteristics or steps and in any order unless the disclosure or context suggests otherwise. This disclosure hereby incorporates by reference the complete disclosure of any patent or application from which it claims benefit or priority.
LIST OF REFERENCE SIGNS
[0348] 1, 2, 3, 4, 5 Equidistant points [0349] a, b, c, d, e Points of the terrain profile [0350] a′, b′, c′, d′, e′ SAR image representation [0351] a*, b*, c*, d*, e* Horizontal distances [0352] p, q Terrain points [0353] p′, q′ Image representations of the terrain points p, q [0354] 100 Navigation apparatus [0355] 101 Image capturing device or SAR system [0356] 101a SAR raw data acquiring device or SAR sensor [0357] 101a′ Antenna [0358] 101b Altitude detecting device [0359] 101c INS acquiring device [0360] 101d SAR control device [0361] 102 Template database [0362] 103 Correlation device [0363] 104 Evaluation device [0364] 104a Feature selection device [0365] 104b Image generating device [0366] 104c Image navigation device [0367] 105 Output interface [0368] 106 Radar image [0369] 106a SAR strips [0370] 106b SAR image [0371] 106c Processed image [0372] 107 Template [0373] 108 Navigation system [0374] 109 Navigation module [0375] 111 Position estimating device [0376] 112 Backup sensor device [0377] 113, 114, 115 Connecting lines [0378] 201 Reference elevation [0379] 202 Terrain profile [0380] 203 Depression angle [0381] 204 Radar beams [0382] 501 Geo-referenced landmark [0383] 502 Aircraft [0384] 601 SAR image plane [0385] 602 Reference plane [0386] 603 Reference elevation h_ref [0387] 604 Ground point elevation of ground point a [0388] 701 Geo-referenced pixel, geoTag_Ila, [0389] 701′ Center pixel [0390] 702 Corner points, cornerPoints_Ila [0391] 801 In-range direction [0392] 802 Cross-range direction [0393] 803 Center line [0394] 804 Range gr_center to the center line [0395] 805 Landmark elevation h_center [0396] 806 North direction [0397] 807 Movement direction [0398] 807′ Movement direction projected in the horizontal plane [0399] 808 Slant range r_center to the center line [0400] 809 Absolute elevation difference [0401] 810 Flight altitude h_AC [0402] 901 Horizontal range gr.sub.i, j of each pixel of the template [0403] 902 In-range shift [0404] 903 
Cross-range shift [0405] 1000 Altitude-ground range diagram [0406] 1001 Abscissa [0407] 1002 Ordinate [0408] 1003 Radar beam [0409] 1004 Ground component [0410] 1004e Delta-east component ground component [0411] 1004n Delta-north component ground component [0412] 1005 Reference elevation [0413] 1006 Altitude above reference elevation [0414] 1100 North-east diagram [0415] 1101 East direction [0416] 1102 North direction [0417] 1103 Flight direction [0418] XX Image capturing metadata [0419] YY Landmark metadata [0420] f.sub.D Doppler shift [0421] λ Wavelength [0422] r (Slant) range [0423] gr Horizontal range/Ground range [0424] θ Squint angle [0425] Δθ Squint angle error [0426] ϕ Depression angle [0427] v Absolute speed [0428] {right arrow over (v)} Velocity vector [0429] Δv Absolute speed error [0430] β Directional error of speed vector [0431] h Altitude [0432] Δh Altitude error [0433] s Shift of a pixel of the template