METHOD FOR MONITORING AT LEAST ONE SENSOR

20220398879 · 2022-12-15

    Abstract

    A method for monitoring at least one sensor in a vehicle. In the method, first data of the at least one sensor are compared to second data, which are provided by at least one further data source, and the function of the at least one sensor is assessed on the basis of this comparison.

    Claims

    1. A method for monitoring at least one sensor in a vehicle, the method comprising the following steps: comparing first data of the at least one sensor to second data provided by at least one further data source; and assessing a function of the at least one sensor based on the comparison.

    2. The method as recited in claim 1, further comprising: recognizing at least one attachment part at the vehicle.

    3. The method as recited in claim 2, further comprising: evaluating possibilities with respect to a probability of an impairment by the at least one attachment part.

    4. The method as recited in claim 1, wherein at least one further sensor is used as at least one of the at least one data source.

    5. The method as recited in claim 4, wherein a synchronous analysis is carried out.

    6. The method as recited in claim 4, wherein an asynchronous analysis is carried out.

    7. The method as recited in claim 1, wherein historic data are used as at least one of the at least one data source.

    8. The method as recited in claim 1, wherein at least one localization is used as at least one of the at least one data source.

    9. The method as recited in claim 1, wherein at least one vehicle property is used as at least one of the at least one data source.

    10. The method as recited in claim 9, wherein at least one static vehicle property is used.

    11. The method as recited in claim 9, wherein at least one dynamic vehicle property is used.

    12. An arrangement for monitoring at least one sensor, the arrangement being configured to: compare first data of the at least one sensor to second data provided by at least one further data source; and assess a function of the at least one sensor based on the comparison.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0021] FIG. 1 shows a temporally synchronous analysis and a temporally asynchronous analysis on the basis of a vehicle, in accordance with an example embodiment of the present invention.

    [0022] FIG. 2 shows a schematic view of a vehicle including an arrangement for carrying out a method in accordance with an example embodiment of the present invention.

    [0023] FIG. 3 shows a possible sequence of the provided method in a flowchart, in accordance with an example embodiment of the present invention.

    DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

    [0024] The present invention is schematically shown on the basis of specific embodiments in the figures and is described in greater detail hereinafter with reference to the figures.

    [0025] FIG. 1 shows a temporally synchronous and asynchronous analysis. The representation shows a vehicle which is identified as a whole by the reference numeral 10 and which travels on a roadway 12. Furthermore, objects 14 are apparent, in this case trees.

    [0026] Vehicle 10 includes a front camera 16 and a front radar device 18. Front camera 16 provides a first detection range 20 and front radar device 18 provides a second detection range 22. These two detection ranges 20 and 22 are used for a temporally synchronous analysis 25.

    [0027] Furthermore, a side radar device 24 is provided, which provides a third detection range 26. This third detection range 26, in combination with first detection range 20 and/or second detection range 22, is used for a temporally asynchronous analysis 27, which may be carried out alternatively or in addition to synchronous analysis 25.

    [0028] In the synchronous analysis, it is checked whether objects in the overlapping field of view of two or more sensors are detected by both sensors. If this is not the case, one of the sensors is possibly hidden. The asynchronous analysis uses the characteristic motion information of the vehicle to predict the motion of the object and then to check whether the object is detected by the sensor at the predicted point in time.
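    The synchronous overlap check described above can be expressed as a minimal sketch. This is illustrative only and not code from the patent; the names `Detection`, `synchronous_check`, `in_overlap`, and `match_radius` are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Detected object position in a common vehicle coordinate frame (meters)."""
    x: float
    y: float

def synchronous_check(dets_a, dets_b, in_overlap, match_radius=1.0):
    """Flag objects in the overlapping field of view that only one sensor reports.

    dets_a / dets_b are detection lists from two sensors; in_overlap decides
    whether a detection lies in the shared part of both fields of view.
    """
    def matched(d, others):
        # An object counts as confirmed if the other sensor reports
        # a detection within match_radius of the same position.
        return any((d.x - o.x) ** 2 + (d.y - o.y) ** 2 <= match_radius ** 2
                   for o in others)

    missed_by_b = [d for d in dets_a if in_overlap(d) and not matched(d, dets_b)]
    missed_by_a = [d for d in dets_b if in_overlap(d) and not matched(d, dets_a)]
    return missed_by_a, missed_by_b
```

    Objects returned in either list indicate a possible restriction of the corresponding sensor; as the text notes, with only two sensors it remains open which of them is impaired.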

    [0029] 1. In a first specific embodiment of the present invention, sensor data are available as a further data source.

    [0030] If two or more sensors are available in a vehicle for surroundings detection, for example, radar sensors and video cameras, their depictions of the surroundings may be compared to one another. Restrictions of the detection capability or of the sensor field of view of these sensors may be derived therefrom. It may advantageously be ascertained whether objects of the real surroundings of the vehicle which are perceptible by the observed sensors are actually seen by both sensors.

    [0031] The following is carried out for the temporally synchronous analysis:

    [0032] In the case of overlapping sensor fields of view of the observed sensors, it is compared whether all objects in the overlapping part of the sensor fields of view are detected by all observed sensors. If objects are not detected by one or more individual sensors, this indicates a restriction of the detection capability of the sensor in question in the part of the field of view in which the object was detected by other sensors. It is to be noted that if the nature of the impairing environmental influence is unclear, it may not be reliably ascertained which sensor system correctly reproduces the surroundings. A configuration having more than two different, or at least differently positioned, sensors is therefore advantageous.

    [0033] The following is carried out for the temporally asynchronous analysis:

    [0034] Pieces of information from nonoverlapping parts of the sensor fields of view of the observed sensors may also be used to ascertain restrictions of the detection capability or restrictions of the sensor field of view of individual sensors. It may be ascertained in this case during the trip whether objects which were detected by one or multiple sensors are also detected by other sensors having a differing field of view.

    [0035] As soon as the vehicle has moved far enough that these objects are expected in the nominal field of view of the other sensors, a statement may be made about their detection. Both static and moving objects may be used for this purpose. However, static objects offer the advantage of simpler reproducibility of their expected position relative to the ego vehicle or the nominal sensor fields of view. If expected objects are not detected by one of the sensors in an arbitrarily large part of the field of view, this indicates a restriction of the detection capability in this sensor field of view.
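    The asynchronous analysis relies on predicting where a previously observed static object should reappear once the vehicle has moved. A minimal sketch of that prediction, assuming a planar rigid ego motion; the function name and the (dx, dy, dyaw) parameterization are illustrative:

```python
import math

def predict_in_ego_frame(obj_xy, dx, dy, dyaw):
    """Predict where a static object, seen earlier at obj_xy in the ego frame,
    should appear after the vehicle has translated by (dx, dy) and rotated
    by dyaw radians (counterclockwise). Inverse rigid motion in the plane."""
    # Undo the translation first, then rotate into the new vehicle frame.
    x, y = obj_xy[0] - dx, obj_xy[1] - dy
    c, s = math.cos(-dyaw), math.sin(-dyaw)
    return (c * x - s * y, s * x + c * y)
```

    The predicted position can then be tested against the nominal field of view of the other sensor: if it lies inside but no detection appears at the predicted point in time, a restriction is indicated there.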

    [0036] 2. In a second specific embodiment, localization data are available as a further data source.

    [0037] The localization approach compares the presently detected surroundings to a reference image in order to recognize field of view restrictions.

    [0038] If pieces of information about its surroundings are available in a vehicle, for example, in the form of map data and a GPS localization of the vehicle, these may be used to infer an impairment of the detection capability of a sensor for surroundings detection installed at the vehicle.

    [0039] The pieces of surroundings information obtained from the sensor signals may be compared over time to these externally contributed pieces of surroundings information. In this way, it may be ascertained whether the affected sensor is partially or even completely restricted in its detection capability.

    [0040] If an object is not recognized at the position stored for it in an external database, an impairment of the detection capability may be inferred. Preferably, those objects are observed which have a unique signal signature with respect to the sensor system used, for example, a high radar backscattering cross section.

    [0041] The process chain may be constructed as follows:

    [0042] 1. Pieces of external surroundings information from a database which are presently in the nominal sensor field of view are used.

    [0043] 2. These pieces of surroundings information are compared to the measured pieces of surroundings information of the sensor.

    [0044] 3. If 1 and 2 do not coincide, an impairment of the detection capability may be inferred in the part of the sensor field of view in which the expected surroundings information was not detected.
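    The three steps of this process chain might be sketched as follows; `check_against_map`, the (x, y) tuple representation of objects, and `match_radius` are illustrative assumptions:

```python
def check_against_map(map_objects, detections, in_fov, match_radius=2.0):
    """Return expected map objects in the nominal field of view that the
    sensor failed to detect, indicating an impairment in that region."""
    missed = []
    for obj in map_objects:
        if not in_fov(obj):
            continue  # step 1: consider only objects in the nominal field of view
        # step 2: compare expected position against the measured detections
        hit = any(abs(obj[0] - d[0]) <= match_radius and
                  abs(obj[1] - d[1]) <= match_radius
                  for d in detections)
        if not hit:
            missed.append(obj)  # step 3: mismatch -> suspected impairment
    return missed
```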

    [0045] A specific example is given hereinafter:

    [0046] A radar sensor is installed laterally at a vehicle and the vehicle is located in the right lane of a road in front of an intersection including a traffic light. In this situation, based on the pieces of information from the database, the traffic light is to be expected at a specific GPS coordinate. If this coordinate is located in the nominal sensor detection range and the sensor does not detect the traffic light, it may be inferred that the sensor is restricted in its detection capability in the part of the sensor field of view in which the traffic light is located.

    [0047] To increase the stability of the method, it may be provided that multiple objects are observed, possibly also over time, and that the piece of information obtained therefrom about the sensor restriction is suitably averaged.

    [0048] The described method may be used both for one and also for multiple sensors. Furthermore, one or multiple external surroundings information sources may be used in combination, for example, map data and GPS localization. Furthermore, data may also be used which contain surroundings data detected directly by a sensor system combined with GPS coordinates, for example, a so-called radar road signature, pieces of information on road condition and type, repeatedly detected surroundings profiles. The term radar road signature includes characteristic measured radar reflections, reflection points, and/or radar landmarks.

    [0049] The last-mentioned point refers, for example, to commercial vehicles, which often carry out repeating tasks with little variance on a regularly covered route. This applies in particular to parcel services or garbage collection trucks, but also to construction site vehicles which are used for a specific construction site. If the sensor system recognizes that there are only minor changes in the regularly detected surroundings, deviations in the surroundings detection with respect to otherwise regularly detected noticeable landmarks may be understood as an indication of a shadowing of the sensor.

    [0050] 3. In a third specific embodiment, static and/or dynamic vehicle properties are used as a further data source.

    [0051] If pieces of information about characteristic variables of the vehicle geometry and the vehicle statics or dynamics are available in a vehicle, for example, the vehicle center of gravity or parameters of the pitch and roll behavior, these may be used to infer attachment parts. For this purpose, a deviation or change of these parameters may be compared to values stored for specific attachments, for example, attachment dimensions, weights, and positions. If the change of the measured parameters corresponds, within an error tolerance, to the parameters stored for an attachment type, it may be assumed that an attachment of this type is installed.
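    The tolerance-based matching of measured parameter deviations against stored attachment types could look like the following sketch; the dictionary layout and all parameter names are assumptions for illustration:

```python
def classify_attachment(measured_deviation, attachment_db, tolerance):
    """Match measured parameter deviations (e.g. center-of-gravity shift,
    added mass) against per-attachment-type reference values.

    Returns the first attachment type whose stored parameters all agree
    with the measurement within the given per-parameter tolerance,
    or None if no type matches."""
    for name, stored in attachment_db.items():
        if all(abs(measured_deviation[k] - stored[k]) <= tolerance[k]
               for k in stored):
            return name
    return None
```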

    [0052] In a further specific embodiment, measured deviations of the parameters are compared to corresponding threshold values. If these threshold values are exceeded, the presence of an attachment part, which is not specified in greater detail, may be presumed.

    [0053] Variants include the use or the comparison of one parameter and also multiple parameters in combination. It is also possible to supply the results of the comparison of all or parts of the observed vehicle parameters to a learning network for the purposes of an overall assessment.

    [0054] Possible measured variables are, for example:

    [0055] Microvibrations due to a snow remover blade, a snowplow, or a snowblower,

    [0056] Vehicle rolling due to mowing and cleaning attachments, crane attachments, or general protruding attachments,

    [0057] Intrinsic pitch and roll behavior and longitudinal and lateral jerks during different driving maneuvers.

    [0058] A specific example is given hereinafter:

    [0059] For a snow plow attached to the front side of a commercial vehicle, it is characteristic that it causes a shift of the center of gravity of the vehicle toward the vehicle front. If such a shift of the vehicle center of gravity is detected and its value corresponds, within an error tolerance, to a parameter stored for such a type of attachment, it is established and possibly reported that such an attachment is present on the vehicle. In a further embodiment, the value of the measured shift is compared to a threshold value above which an attachment present on the vehicle may be presumed.
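    The snow-plow example, combining the tolerance match with the threshold variant of the preceding paragraphs, might be sketched as follows; all numeric values are assumed for illustration and do not come from the source:

```python
def cog_shift_indicates_attachment(shift_m,
                                   plow_shift_m=0.15,
                                   tol_m=0.03,
                                   threshold_m=0.05):
    """Assess a measured forward center-of-gravity shift (meters).

    First try to match the stored reference value for a snow plow within
    the error tolerance; otherwise fall back to the plain threshold check
    for an unspecified attachment. All defaults are illustrative."""
    if abs(shift_m - plow_shift_m) <= tol_m:
        return "snow_plow"           # matches the stored attachment type
    if shift_m > threshold_m:
        return "unknown_attachment"  # threshold exceeded, type unknown
    return None                      # no attachment presumed
```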

    [0060] FIG. 2 shows a simplified, schematic representation of a vehicle 50, which is equipped with an arrangement 52 for carrying out the method provided herein. Vehicle 50 includes a sensor 54, which has a detection range 56, for which first data 57 may be obtained. It is apparent that detection range 56 is influenced or impaired by an attachment part 58, i.e., detection range 56 is partially hidden and thus reduced. This attachment part 58 thus has an effect on the function or functionality of sensor 54.

    [0061] Furthermore, a further data source 60 is shown which provides second data 61. These second data 61 are compared to first data 57 in arrangement 52. In this way, it may be detected that attachment part 58 is present. It may possibly also be established what the attachment part is; attachment part 58 may thus be specified or identified. It is thus possible to describe the attachment part with respect to dimensions or a dynamic characteristic, or to name this attachment part.

    [0062] FIG. 3 shows a possible sequence of the provided method in a flowchart. In a first step 80, first data are obtained using a sensor. In a second step 82, second data are provided by a further data source. The first data and the second data are compared to one another in a third step 84 and then, in a fourth step 86, a statement is made about the function or functionality of the sensor and an attachment part is possibly identified. It is of course possible to take more than two data sources into consideration.
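    The four steps of FIG. 3 can be summarized in a small sketch; the function names and the callback-based structure are illustrative assumptions, not the claimed arrangement itself:

```python
def monitor_sensor(get_first_data, data_sources, compare, assess):
    """Sequence of the method of FIG. 3 as an illustrative sketch.

    get_first_data: obtains the sensor's first data        (step 80)
    data_sources:   callables providing second data        (step 82);
                    more than one further source may be given
    compare:        compares first data to one set of second data (step 84)
    assess:         turns all comparison results into a statement
                    about the sensor's functionality       (step 86)
    """
    first = get_first_data()                                   # step 80
    results = [compare(first, source()) for source in data_sources]  # 82/84
    return assess(results)                                     # step 86
```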