METHOD FOR PROCESSING SENSOR DATA

20230227050 · 2023-07-20

    Abstract

    A method for processing sensor data in a system that includes multiple sensors for detecting at least a subarea of surroundings around the system. The method includes at least the following steps: a) reading in sensor data detected at least partially in parallel, b) checking whether an at least partial impairment of the detection by the respective sensor may be established for one or for multiple of the sensors on the basis of the read-in sensor data, c) adapting the use of the sensor data, taking the check from step b) into account.

    Claims

    1. A method for processing sensor data in a system that includes multiple sensors for detecting at least one subarea of surroundings around the system, comprising the following steps: a) reading in sensor data detected at least partially in parallel by the sensors; b) checking, on the basis of the read-in sensor data, whether an at least partial impairment of a detection by a respective sensor of the sensors may be established for one or for multiple of the sensors; c) adapting the use of the sensor data taking the checking from step b) into account.

    2. The method as recited in claim 1, wherein one or multiple of the sensors are camera sensors.

    3. The method as recited in claim 1, wherein the checking in step b) takes place based on a comparison of detections by different sensors of the sensors.

    4. The method as recited in claim 1, further comprising: providing at least one piece of information about: a selected sensor data stream, and/or an established impairment of the detection, and/or a position of a sensor of the sensors for a main data path changed due to the selection.

    5. The method as recited in claim 1, wherein the sensors are two optical sensors of a stereo camera.

    6. The method as recited in claim 1, wherein the sensor data or sensor data streams are processed at least partially separately from one another.

    7. The method as recited in claim 1, wherein the system is a system for at least semi-assisted and/or automated driving.

    8. A non-transitory machine-readable memory medium on which is stored a computer program for processing sensor data in a system that includes multiple sensors for detecting at least one subarea of surroundings around the system, the computer program, when executed by a computer, causing the computer to perform the following steps: a) reading in sensor data detected at least partially in parallel by the sensors; b) checking, on the basis of the read-in sensor data, whether an at least partial impairment of a detection by a respective sensor of the sensors may be established for one or for multiple of the sensors; c) adapting the use of the sensor data taking the checking from step b) into account.

    9. A system, comprising: multiple sensors configured for an at least partially parallel detection of at least one subarea of surroundings around the system; one or multiple units configured to check, based on sensor data detected by the sensors, whether an at least partial impairment of a detection by a respective sensor of the multiple sensors may be established for one or for multiple of the sensors; a unit configured to select a main sensor data stream or sensor data stream for a main data path.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0041] FIG. 1 schematically shows an exemplary flowchart of a method according to the present invention presented herein.

    [0042] FIG. 2 schematically shows an exemplary potential application of the method according to the present invention presented herein.

    [0043] FIG. 3 schematically shows an illustration of one exemplary situation, in which the method according to the present invention presented herein may be advantageously applied.

    [0044] FIG. 4 schematically shows an exemplary flowchart of one advantageous embodiment variant of the method presented herein, and

    [0045] FIG. 5 schematically shows an exemplary design of one advantageous embodiment variant of the system according to the present invention presented herein.

    DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

    [0046] FIG. 1 schematically shows an exemplary flowchart of a method presented herein. The method is used for processing sensor data in a system 1 that includes multiple sensors 2, 3 for detecting at least one subarea of surroundings around system 1. The order of steps a), b), and c), represented by blocks 110, 120, and 130, is exemplary and may, for example, be run through at least once in the order represented for carrying out the method.

    [0047] In block 110, sensor data detected, in particular, temporally at least partially in parallel are read in according to step a), preferably from different, in particular, at least partially parallel sensor data streams, by sensors 2, 3 having, in particular, at least partially overlapping detection areas 4, 5.

    [0048] In block 120, a check takes place according to step b) as to whether an at least partial impairment of the detection by respective sensors 2, 3, in particular, an at least partial blindness of respective sensors 2, 3 may be established for one or for multiple of sensors 2, 3 on the basis of the read-in sensor data.

    [0049] In block 130, an adaptation of the use of the sensor data takes place according to step c) taking the check from step b) into account, a selection of a main sensor data stream or sensor data stream for a main data path 11 of system 1 from the different sensor data streams, in particular, taking place.
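The flow of steps a) to c) through blocks 110, 120, and 130 can be sketched as follows. This is a minimal illustration, assuming each sensor delivers its stream as a list of samples and the impairment check is supplied by the caller; the names read_in, check_impairment, and select_main_stream are illustrative, not taken from the source.

```python
# Illustrative sketch of steps a) to c); all names are assumptions.

def read_in(streams):
    """Step a): read in the (at least partially parallel) sensor data streams."""
    return {sensor_id: list(samples) for sensor_id, samples in streams.items()}

def check_impairment(data, is_impaired):
    """Step b): check each sensor's data for an at least partial impairment."""
    return {sensor_id: is_impaired(samples) for sensor_id, samples in data.items()}

def select_main_stream(data, impairment):
    """Step c): adapt the use of the data, here by selecting an unimpaired
    stream for the main data path (falling back to the first stream)."""
    for sensor_id in data:
        if not impairment[sensor_id]:
            return sensor_id
    return next(iter(data))

streams = {"left": [0.9, 0.8, 0.7], "right": [0.0, 0.0, 0.0]}  # right: no signal
data = read_in(streams)
impairment = check_impairment(data, is_impaired=lambda s: max(s) == 0.0)
main = select_main_stream(data, impairment)
```

Here the impaired right stream is passed over and the left stream is selected for the main data path.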

    [0050] For example, one or multiple of sensors 2, 3 may be camera sensors and/or one or multiple of the sensor data streams may be image data streams.

    [0051] Optionally, a provision of at least one piece of information may take place in block 140 according to step d), the information concerning:

    [0052] the selected (main) sensor data stream, and/or

    [0053] an established impairment of the detection, and/or

    [0054] a position of sensor 2, 3 for main data path 11 changed due to the selection.

    [0056] For example, the sensor data or sensor data streams may be processed, in particular, checked, at least partially separately from one another. Furthermore, system 1 may be a system for at least semi-assisted and/or automated driving (cf. FIG. 2).

    [0057] FIG. 2 schematically shows one exemplary potential application of the method presented herein. In this context, FIG. 2 shows by way of example and schematically a top view onto a vehicle 13 including three cameras 14, which are mounted behind the windshield and face forward. At least two of cameras 14 may, for example, be used as sensors 2, 3 for the method described herein.

    [0058] FIG. 3 schematically shows an illustration of one exemplary situation, in which the method presented herein may be advantageously used. In this context, FIG. 3 shows an example of one-sided blindness of a stereo imager pair. The left imager has a clear view, whereas the right imager is largely blind as a result of rain. An “imager” here describes an example of an optical sensor or image sensor. The stereo imager pair may be part of a stereo camera.

    [0059] Thus, FIG. 3 also illustrates an example of the fact that, and optionally of how, sensors 2, 3 may be the two optical sensors of a stereo camera. Furthermore, FIG. 3 also shows an example of the fact that, and optionally of how, the check in step b) may take place on the basis of a comparison of, in particular, at least partially redundant detections of different sensors 2, 3.

    [0060] FIG. 4 schematically shows an exemplary flowchart (block diagram) of one advantageous embodiment variant of the method presented herein. The method may include multiple, here, for example four, steps.

    [0061] In block 210, a reading out of multiple data streams may take place. This may represent an example of the fact that, and optionally of how, according to step a) a reading in of sensor data detected at least partially in parallel may take place.

    [0062] A reading in of image data streams, in particular, may take place. In this case, an arbitrary number of temporally synchronous image data streams may be read in. The number may range from two up to a dozen or more data streams.

    [0063] In block 220, a recognition of the (partial) blindness for each image data stream may take place. This may represent an example of the fact that, and optionally of how, according to step b) a check may take place as to whether an at least partial impairment of the detection by respective sensor 2, 3 may be established for one or for multiple of sensors 2, 3 on the basis of the read-in sensor data.

    [0064] A blindness for each image data stream may be established, in particular, individually (cf. below regarding failure recognition). A blindness in the technical sense may also be present here, in particular, when random or persistent hardware errors occur. These may come about as a result of aging, cosmic radiation or mechanical damage.
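The individual check per image data stream in block 220 can be sketched as a small function that treats both an obstructed view and a hardware fault as blindness in the technical sense. The inputs (a mean-contrast value per recent frame and a hardware-error flag) and the threshold are illustrative assumptions, not specified in the source.

```python
# Illustrative per-stream blindness check; inputs and threshold are assumptions.

def stream_blind(frame_contrasts, hardware_error, contrast_threshold=0.05):
    """Return True if the stream should be treated as (at least partially) blind."""
    if hardware_error:
        # Random or persistent hardware faults (aging, cosmic radiation,
        # mechanical damage) also count as blindness in the technical sense.
        return True
    # Persistently very low contrast suggests an obstructed view.
    return all(c < contrast_threshold for c in frame_contrasts)

statuses = {
    "left":  stream_blind([0.4, 0.5, 0.4], hardware_error=False),
    "right": stream_blind([0.01, 0.02, 0.01], hardware_error=False),
}
```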

    [0065] In block 230, a selection of an image data stream for the main data path may take place. This may represent an example of the fact that, and optionally of how, according to step c) an adaptation of the use of the sensor data may take place taking the check from step b) into account.

    [0066] The main image data stream may be selected, in particular, on the basis of the blindness. In the case of more than two image data streams, further, supplemental data streams may be selected, for example, for a stereo disparity.

    [0067] If only a partial blindness is present, system functions that normally require multiple image data streams may advantageously also be partially maintained. For this purpose, additional data paths may be selected in such a way that the residual information content is as large as possible. This may also take place on the basis of further pieces of information. For a stereo system, for example, the image sensor whose blindness in the area of the road or of other relevant regions is smallest may be particularly important as a secondary data stream.
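For more than two image data streams, the selection of main and supplemental paths described above can be sketched as follows. The scoring (an overall blindness score plus a blindness score restricted to a relevant region such as the road area) and all identifiers are hypothetical assumptions for illustration.

```python
# Hypothetical selection of main and secondary data paths from blindness
# scores in [0, 1]; camera ids and region scoring are assumptions.

def select_paths(blindness_overall, blindness_road):
    """Return (main, secondary): least blind stream overall, then the stream
    with minimal blindness in the relevant (road) region among the rest."""
    main = min(blindness_overall, key=blindness_overall.get)
    remaining = {k: v for k, v in blindness_road.items() if k != main}
    secondary = min(remaining, key=remaining.get)
    return main, secondary

main, secondary = select_paths(
    blindness_overall={"cam0": 0.1, "cam1": 0.6, "cam2": 0.3},
    blindness_road={"cam0": 0.0, "cam1": 0.2, "cam2": 0.5},
)
```

With these scores, cam0 becomes the main stream and cam1, whose road-area blindness is smallest among the remaining cameras, becomes the secondary stream, e.g. for a stereo disparity.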

    [0068] In block 240, a provision of the image data, position of the camera, and blindness status in the system may take place. This may represent an example of the fact that, and optionally of how, according to an optional step d) a provision of at least one piece of information may take place about:

    [0069] the selected (main) sensor data stream, and/or

    [0070] an established impairment of the detection, and/or

    [0071] a position of sensor 2, 3 for main data path 11 changed due to the selection.

    [0073] The information about the selected data streams, in particular, may be provided in the system. In addition, data about the blindness per se and about the change of the camera position of the main data path may be advantageously sent or provided. The changed camera position, in particular, may be an advantageous piece of information for the entire system, for example, for assigning calibration data and for algorithms for depth reconstruction.
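One conceivable way of providing these pieces of information in the system is a small status record carrying the selected main stream, the per-stream blindness status, and the (possibly changed) camera position of the main data path, which downstream consumers could use, e.g., for assigning calibration data. All field names are illustrative assumptions.

```python
# Hypothetical status record for step d); field names are assumptions.
from dataclasses import dataclass, field

@dataclass
class SelectionStatus:
    main_stream: str                                # selected main sensor data stream
    blindness: dict = field(default_factory=dict)   # per-stream blindness flags
    main_camera_position: str = "left"              # camera position feeding main data path 11

status = SelectionStatus(
    main_stream="right",
    blindness={"left": True, "right": False},
    main_camera_position="right",                   # changed due to the selection
)
```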

    [0074] FIG. 5 schematically shows an exemplary design of one advantageous embodiment variant of system 1 presented herein. In this context, FIG. 5 shows by way of example and schematically a selection of main data path 11 in a stereo system. The blindness ascertained by way of example advantageously serves as a control variable for the selection of the main data stream.

    [0075] System 1 is suitable, in particular, for a vehicle drivable preferably at least in a semi-assisted and/or automated manner. System 1 is configured, in particular, for carrying out a method presented herein.

    [0076] System 1 includes multiple sensors 2, 3 for the, in particular, temporally at least partially parallel detection of at least one subarea, in particular, of at least partially overlapping detection areas 4, 5 of surroundings around system 1, the sensors 2, 3 being able to provide the detected sensor data preferably in the form of sensor data streams via data paths 6, 7 extending at least partially separately from one another.

    [0077] System 1 includes one or multiple units 8, 9 for checking whether an at least partial impairment of the detection by relevant sensors 2, 3, in particular, an at least partial blindness of respective sensors 2, 3, may be established for one or for multiple of sensors 2, 3 on the basis of the sensor data detected by sensors 2, 3.

    [0078] System 1 includes a unit 10 for selecting a main sensor data stream or sensor data stream for a main data path 11, in particular, including a switch 12 for switching between the different sensor data streams or data paths 6, 7.

    [0079] System 1 may advantageously adequately degrade relative to the remaining sensor availability.

    [0080] Camera systems may be used in which cameras 14 (cf. FIG. 2) do not face forward, or do not face forward exclusively in parallel to one another. Thus, for example, laterally aligned cameras, which look forward only to a small extent, may partially compensate for a blind front camera. A camera belt may preferably be used.

    [0081] According to one particularly preferred embodiment variant, an availability-based selection of the main image data stream, in particular, with application for (semi-)automated driving, may be specified.

    [0082] Possible options for a failure recognition are described below:

    [0083] For advantageously assuring the function of a camera system 1, including one or multiple cameras 14, it may be checked for each image sensor 2, 3 at regular intervals whether a clear view of the surroundings is present. This may include, for example, a recognition of external disruptions in the sight path (blindness) or recognition of technical defects (hardware errors). Both cases may result in a partial or complete failure or a degradation of camera 14. Failures may be permanent or temporary. In the case of a sensor failure, system 1 may advantageously adequately degrade, for example, may cease functioning partially or entirely (cf. FIG. 3).

    [0084] Depending on how granularly camera degradation is measured (for example, in time, location, cause and effect) and how reliably that occurs, a more or less granular system degradation may be implemented.

    [0085] A blindness recognition may be implemented in different ways. One possible approach is the observation of the movement in a scene. Simply put, when a movement is present in the image or in sections of the image, a clear view may be assumed. Conversely, however, the absence of movement is not indicative of a blindness. A lack of movement is thus merely a necessary, but not a sufficient, condition for blindness.
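The movement-based check can be sketched as follows, approximating "movement" by the mean absolute difference of two consecutive grayscale frames (given as nested lists); the threshold is an illustrative assumption. Note that, per the remark above, only the presence of motion supports a clear-view assumption; its absence proves nothing.

```python
# Illustrative motion check per image section; threshold is an assumption.

def section_has_motion(prev, curr, threshold=5.0):
    """True if the mean absolute pixel difference between two frames of an
    image section exceeds the threshold (then a clear view may be assumed)."""
    diffs = [abs(a - b)
             for row_p, row_c in zip(prev, curr)
             for a, b in zip(row_p, row_c)]
    return sum(diffs) / len(diffs) > threshold

prev = [[10, 10], [10, 10]]
curr = [[30, 30], [30, 30]]      # strong change -> motion -> clear view assumed
moving = section_has_motion(prev, curr)
```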

    [0086] One further approach is the classification of blindness with the aid of a neural network (for example, via deep learning). For this purpose, training data of image sequences having complete, partial or non-existing blindness may be used in order to teach a neural network precisely these states.

    [0087] For overlapping image sections, the comparison of two image data streams may also be particularly advantageously utilized. If the observed scene between two data streams with overlapping fields of view 4, 5 is different, this is a particularly advantageous indicator of a blindness.
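The comparison of two overlapping image data streams can be sketched as a mismatch score over the shared image region: a large discrepancy between the two views is the indicator of one-sided blindness described above. The dissimilarity threshold and the simplification that the whole frame overlaps are assumptions for illustration.

```python
# Illustrative cross-stream comparison for overlapping fields of view;
# threshold and full-frame overlap are assumptions.

def overlap_mismatch(left, right):
    """Mean absolute difference over the overlapping (here: whole) frames."""
    diffs = [abs(a - b)
             for row_l, row_r in zip(left, right)
             for a, b in zip(row_l, row_r)]
    return sum(diffs) / len(diffs)

def one_sided_blindness_suspected(left, right, threshold=50.0):
    """A strongly differing scene between the streams suggests blindness."""
    return overlap_mismatch(left, right) > threshold

left_view  = [[200, 190], [180, 170]]   # clear view
right_view = [[20, 25], [15, 10]]       # largely dark, e.g. obstructed by rain
suspected = one_sided_blindness_suspected(left_view, right_view)
```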

    [0088] In the case of a degradation of a driver assistance system caused by blindness, the driver may immediately take control of vehicle 13. Each degradation may represent a loss of comfort and thus a loss of customer benefit, but safety functions, such as an emergency brake application, may then also no longer be available. Worse still, a degradation may affect highly automated systems. In the worst case, it may result here in an abort of the driving operation.

    [0089] For this reason, maximizing the system availability becomes increasingly important. In the case of systems including partially redundant information sources, in particular, such as for example, in a stereo camera, it is advantageous to prevent or mitigate the system degradation in the case of a partial blindness by advantageously utilizing the redundant data source.

    [0090] One preferred embodiment variant is formed here by a stereo camera (cf. FIG. 5). A stereo camera (system 1 in FIG. 5), in which the main data path 11 may be switched between left and right imagers 2, 3 in accordance with a blindness signal, is particularly preferred.

    [0091] In other words, this relates, in particular, to a stereo camera, in which the main image data stream may alternate between left and right sensor 2, 3 on the basis of the blindness. An exemplary data flow is illustrated in FIG. 5. The blindness information advantageously serves as a control variable for the switching of the main image data stream.
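The switching of the main image data stream with the blindness information as control variable can be sketched as a small state machine for switch 12; the class and method names are illustrative assumptions.

```python
# Illustrative sketch of switch 12 for the stereo case; names are assumptions.

class MainPathSwitch:
    """Alternates main data path 11 between left and right imagers 2, 3,
    controlled by the per-imager blindness signal."""

    def __init__(self, default="left"):
        self.main = default

    def update(self, blind_left, blind_right):
        """Switch the main data path away from a blind imager when possible."""
        if self.main == "left" and blind_left and not blind_right:
            self.main = "right"
        elif self.main == "right" and blind_right and not blind_left:
            self.main = "left"
        return self.main

switch = MainPathSwitch()
path_during_rain = switch.update(blind_left=True, blind_right=False)
path_after_clear = switch.update(blind_left=False, blind_right=False)
```

During the one-sided blindness the main path moves to the right imager; once both imagers are clear again, no switch back is forced, avoiding unnecessary toggling.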

    [0092] In this embodiment variant, it is further advantageous that camera sensors 2, 3 have a large or nearly complete overlapping area. The redundant image information here may be particularly advantageously utilized, resulting in a particularly large advantage for the image data availability.

    [0093] A combination with a hardware separation may be provided as a further embodiment variant. One further possibility of improving the system design with respect to availability is to separate the post-processing of the partially redundant image data streams in the hardware as far as possible. By combining this with a switch 12 for the image data streams, it is possible to advantageously lower the safety load on the hardware.

    [0094] In this regard, an explanation based on the stereo example: Even without a switch concept (classical), it may be worth separating the hardware paths, but then an increased safety load may remain on the hardware that processes main image path 11—if this fails, image and depth information are normally lost. Using a switch concept, a hardware separation may be advantageously improved, because the failure of the hardware may then be partially compensated for by the redundant path. If the hardware of main image path 11 fails, the image may be retained via switch 12 and the second path.
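The failover described in this paragraph can be sketched as a selection over separately running hardware paths: if the hardware of the main image path fails, the image is retained via the second path. The failure flags and path names are illustrative assumptions.

```python
# Illustrative failover across separated hardware paths; names are assumptions.

def active_image_path(main_hw_ok, secondary_hw_ok):
    """Return the data path whose hardware can currently deliver an image."""
    if main_hw_ok:
        return "main_path_11"
    if secondary_hw_ok:
        return "secondary_path"      # redundant path compensates the failure
    return None                      # full degradation: no image available

fallback = active_image_path(main_hw_ok=False, secondary_hw_ok=True)
```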

    [0095] One advantage of the method described is the increased availability of video data. A system degradation in the case of an individual blind or partially blind image sensor 2, 3 may be advantageously mitigated. In this way, system functions may, if necessary, be maintained or a safer degradation behavior may be implemented.

    [0096] This is advantageous, in particular, for highly autonomous systems, for example, in the area of autonomous driving, where the vehicle occupants temporarily or fully surrender control to the vehicle. The autonomous system may thus potentially still complete missions, if necessary, using modified planning, or may also merely maintain the safety for achieving a safe state. The switch system described may be particularly advantageous for a preferably safe management of one-sided blindness.

    [0097] In the area of driver assistance as well, the method may provide an additional customer benefit as a result of the increased availability, in particular, through increased comfort and greater availability in safety functions, for example, emergency brake application or automatic avoidance of obstacles.

    [0098] In certain system designs, the method described herein could also serve as an alternative to equipping the system with a cleaning system, or favor the choice of a weaker cleaning system.