METHOD AND DRIVER ASSISTANCE SYSTEM FOR CLASSIFYING OBJECTS IN THE ENVIRONMENT OF A VEHICLE

20220397665 · 2022-12-15

Abstract

A method for classifying objects in the environment of a vehicle with the aid of ultrasonic sensors. In the method, ultrasonic signals are emitted, ultrasonic echoes are received from objects in the environment, and the position of a reflection point relative to the ultrasonic sensors is determined using lateration, reflection points being continuously determined and the reflection points being allocated to objects in the environment. Dispersion parameters relating to the position of the reflection points allocated to an object are determined and used as a classification criterion with regard to the type of object. A driver assistance system, and a vehicle including such a driver assistance system, are also described.

Claims

1-11. (canceled)

12. A method for classifying objects in an environment of a vehicle, the method comprising the following steps: emitting ultrasonic signals using ultrasonic sensors; receiving ultrasonic echoes from objects in the environment; determining a position of reflection points relative to the ultrasonic sensors using lateration, the reflection points being continuously determined and the reflection points being allocated to objects in the environment; determining dispersion parameters relating to the position of the reflection points allocated to an object of the objects; and using the dispersion parameters as a classification criterion with regard to a type of the object.

13. The method as recited in claim 12, wherein the dispersion parameters separately indicate a dispersion of the reflection points along two directions orthogonal to each other.

14. The method as recited in claim 12, wherein a central point of the object is determined by averaging the reflection points allocated to the object, and a portion of the reflection points that is located inside or outside a predefined radius around the central point of the object is determined as dispersion parameters.

15. The method as recited in claim 12, wherein a bounding box is determined, which indicates an area in which, with the exception of reflection points determined to be outliers, all reflection points allocated to the object are situated, the dimensions of the bounding box being determined as the dispersion parameters.

16. The method as recited in claim 15, wherein the bounding box has a longitudinal extension and a lateral extension, the longitudinal extension indicating an intensity of a dispersion along a longitudinal direction, and the lateral extension indicating an intensity of the dispersion along a lateral direction.

17. The method as recited in claim 12, wherein an occupancy grid is set up for each of the objects in order to determine the dispersion parameters, and cells of the occupancy grid each have an occupancy value that indicates a number of reflection points allocated to the cell.

18. The method as recited in claim 17, wherein based on the occupancy values of the cells of the occupancy grid, a longitudinal extension and a lateral extension are determined as dispersion parameters, the longitudinal extension indicating an intensity of a dispersion along a longitudinal direction and the lateral extension indicating an intensity of the dispersion along a lateral direction.

19. The method as recited in claim 16, wherein: the longitudinal extension runs parallel to a direction pointing away from the vehicle, and the lateral extension runs perpendicular to the longitudinal extension, or a direction of a greatest extension of the object is determined, and the lateral extension runs parallel to the direction and the longitudinal extension runs perpendicular to the direction, or an object model having a point geometry or a line geometry is allocated to the object by evaluating a relative position of the reflection points allocated to the object.

20. The method as recited in claim 19, wherein the object model having a line geometry is allocated to the object, and wherein the lateral extension runs parallel to an orientation of the line and the longitudinal extension runs perpendicular thereto.

21. The method as recited in claim 19, wherein in the classification, a curb is identified by a large dispersion in a direction of the lateral extension and a low dispersion in a direction of the longitudinal extension, a punctiform object is identified by a low dispersion in the direction of the lateral extension and a low dispersion in the direction of the longitudinal extension, and a pedestrian is identified by a large dispersion in the direction of the lateral extension and a large dispersion in the direction of the longitudinal extension.

22. A driver assistance system, comprising: at least two ultrasonic sensors having at least partially overlapping fields of view; and a control unit; wherein the driver assistance system is configured to classify objects in an environment of a vehicle, the driver assistance system configured to: emit ultrasonic signals using the ultrasonic sensors; receive ultrasonic echoes from objects in the environment; determine a position of reflection points relative to the ultrasonic sensors using lateration, the reflection points being continuously determined and the reflection points being allocated to objects in the environment; determine dispersion parameters relating to the position of the reflection points allocated to an object of the objects; and use the dispersion parameters as a classification criterion with regard to a type of the object.

23. A vehicle, including a driver assistance system, the driver assistance system comprising: at least two ultrasonic sensors having at least partially overlapping fields of view; and a control unit; wherein the driver assistance system is configured to classify objects in an environment of the vehicle, the driver assistance system configured to: emit ultrasonic signals using the ultrasonic sensors; receive ultrasonic echoes from objects in the environment; determine a position of reflection points relative to the ultrasonic sensors using lateration, the reflection points being continuously determined and the reflection points being allocated to objects in the environment; determine dispersion parameters relating to the position of the reflection points allocated to an object of the objects; and use the dispersion parameters as a classification criterion with regard to a type of the object.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0035] Embodiments of the present invention will be described in greater detail based on the figures and the following description.

[0036] FIG. 1 shows a vehicle having a driver assistance system according to an example embodiment of the present invention.

[0037] FIG. 2 shows a typical dispersion characteristic of a pedestrian.

[0038] FIG. 3 shows a typical dispersion characteristic of a punctiform object.

[0039] FIG. 4 shows a typical dispersion characteristic of a curb.

DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

[0040] Identical or similar elements are denoted by the same reference numerals in the following description of embodiments of the present invention, and a repeated description of these elements is dispensed with in individual cases. The figures represent the subject matter of the present invention only schematically.

[0041] FIG. 1 shows a vehicle 1, which includes a driver assistance system 100 according to the present invention. In the illustrated example, driver assistance system 100 includes four ultrasonic sensors 12, 13, 14, 15, which are all mounted at the front of vehicle 1 and each connected to a control unit 18. Control unit 18 is configured to emit ultrasonic signals 20 with the aid of ultrasonic sensors 12, 13, 14, 15 and to receive ultrasonic echoes 22, 24 from objects 30 in the environment of vehicle 1.

[0042] Ultrasonic sensors 12, 13, 14, 15 are situated at the front of vehicle 1 in such a way that the fields of view of at least two ultrasonic sensors 12, 13, 14, 15 overlap at least partially. In the situation depicted in FIG. 1, object 30 is located both in the field of view of a first ultrasonic sensor 12 and in the field of view of a second ultrasonic sensor 13. In the sketched example, first ultrasonic sensor 12 emits an ultrasonic signal 20, which is reflected by object 30. Ultrasonic echo 22 received by first ultrasonic sensor 12 is denoted as a direct echo because it was received by the same ultrasonic sensor 12, 13, 14, 15 that also emitted the original ultrasonic signal 20. A further ultrasonic echo 24, which is received by second ultrasonic sensor 13, is denoted as a cross echo because it was received by a different ultrasonic sensor 12, 13, 14, 15.
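The combination of a direct echo and a cross echo can be resolved into a reflection point by lateration: the direct echo constrains the point to a circle around the emitting sensor, and the cross echo to an ellipse with the two sensors as foci. The following Python sketch illustrates this under assumptions not stated in the source (a 2-D sensor frame, echo travel times already converted to distances); the function and variable names are illustrative.

```python
import math

def laterate(sensor_a, sensor_b, direct_dist, cross_path):
    """Locate a reflection point from one direct echo and one cross echo.

    direct_dist: one-way distance sensor A -> object (half the direct-echo path).
    cross_path:  total path sensor A -> object -> sensor B (cross echo).
    """
    b = math.dist(sensor_a, sensor_b)        # sensor baseline
    r = direct_dist                          # circle radius around sensor A
    s = cross_path - r                       # implied distance object -> sensor B
    # In a local frame with A at the origin and B on the +x axis:
    #   x^2 + y^2 = r^2   and   (x - b)^2 + y^2 = s^2
    x = (r * r - s * s + b * b) / (2 * b)
    y_sq = r * r - x * x
    if y_sq < 0:
        return None                          # measurements are inconsistent
    y = math.sqrt(y_sq)                      # assume the object is in front of the bumper
    # Rotate/translate the local solution back into vehicle coordinates.
    ux, uy = (sensor_b[0] - sensor_a[0]) / b, (sensor_b[1] - sensor_a[1]) / b
    return (sensor_a[0] + x * ux - y * uy,
            sensor_a[1] + x * uy + y * ux)
```

In practice the sign of the y solution would be fixed by the sensors' mounting direction, since the circle and ellipse intersect symmetrically on both sides of the baseline.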

[0043] For the classification of object 30, the position of reflection points 44 (see FIGS. 2, 3 and 4) is determined with the aid of lateration; a reflection point indicates the point at which the emitted ultrasonic signal 20 was reflected by object 30. This determination is carried out continuously, so that a multitude of reflection points 44 is obtained. Each determined reflection point 44 is allocated to an object 30, it being possible, for instance, to use the distance of reflection points 44 from one another as a criterion for this purpose, or to use an object model allocated to object 30. Such an object model represents a hypothesis about the shape and extent of object 30. For example, it may be hypothesized that object 30 is a punctiform object 34 such as a post. Under a different hypothesis, object 30 is a linear object such as a curb or a wall. Reflection points 44 that correspond to the assumptions of the model are then allocated to the respective object 30.
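The distance criterion mentioned above can be sketched as a greedy clustering of incoming reflection points; the `max_gap` threshold and the greedy strategy are illustrative assumptions, not values or methods prescribed by the source.

```python
import math

def allocate(reflection_points, max_gap=0.3):
    """Greedy allocation: a point joins the first existing object that already
    contains a point within max_gap metres; otherwise it starts a new object."""
    objects = []                      # each object is a list of (x, y) points
    for p in reflection_points:
        for obj in objects:
            if any(math.dist(p, q) <= max_gap for q in obj):
                obj.append(p)
                break
        else:
            objects.append([p])       # no object close enough: new object hypothesis
    return objects
```

A greedy pass like this does not merge two objects that a later point bridges; a production system would typically re-cluster or maintain object hypotheses over time.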

[0044] It is furthermore provided in the method to examine the dispersion of the positions of reflection points 44 more closely and to determine corresponding dispersion parameters. These dispersion parameters are then used as a criterion in a classification of the type of object 30.

[0045] The following FIGS. 2, 3 and 4 show the position of the determined reflection points 44 for different typical objects 30. FIGS. 2, 3 and 4 show an object that is located in front of vehicle 1 and thus in the field of view of multiple ultrasonic sensors 12, 13, 14, 15.

[0046] FIG. 2 shows the distribution of reflection points 44 for an object 30 that is a pedestrian 32. As may be gathered from the illustration of FIG. 2, the ascertained reflection points 44 cluster densely at the actual position of pedestrian 32. Because of the complex shape and structure of pedestrian 32, however, widely dispersed outliers are also detectable, which scatter strongly along the lateral direction in particular. Reflection points 44 also scatter in the longitudinal direction, but longitudinal extension 40 of the dispersion along the longitudinal direction is considerably smaller.

[0047] To analyze the dispersion of reflection points 44 sketched in FIG. 2, the direction of the greatest extension of object 30 may be determined, for example, and the lateral direction aligned with it, so that lateral extension 42 of the dispersion is measured along this direction. The direction of longitudinal extension 40 is consequently orthogonal thereto. Next, a bounding box is able to be set up, that is to say, a frame having a width that corresponds to lateral extension 42 of the dispersion and a length that corresponds to longitudinal extension 40 of the dispersion.
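The bounding-box analysis above, aligning the lateral direction with the direction of greatest extension and reading off the box's width and length, can be sketched as follows. Computing the principal axis from the covariance angle of the point cloud is one possible realization and is assumed here, not prescribed by the source.

```python
import math

def dispersion_extents(points):
    """Return (lateral, longitudinal) extents of a 2-D reflection-point cloud.

    The lateral direction is aligned with the direction of greatest extension
    (principal axis of the point cloud); the longitudinal direction is orthogonal.
    """
    n = len(points)
    mx = sum(p[0] for p in points) / n       # centroid
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)   # principal-axis angle
    ux, uy = math.cos(theta), math.sin(theta)
    # Project each point onto the principal axis and its normal.
    lat = [(p[0] - mx) * ux + (p[1] - my) * uy for p in points]
    lon = [-(p[0] - mx) * uy + (p[1] - my) * ux for p in points]
    return max(lat) - min(lat), max(lon) - min(lon)
```

The two returned values correspond to the width and length of the bounding box and serve directly as the dispersion parameters of the method.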

[0048] FIG. 3 shows, by way of example, the dispersion of the locations allocated to reflection points 44 using the example of a punctiform object 34 as object 30. In comparison with the example of pedestrian 32 of FIG. 2, it can be seen that both longitudinal extension 40 of the dispersion along the longitudinal direction and lateral extension 42 of the dispersion along the lateral direction are low. The dispersion of reflection points 44 is low in the case of a punctiform object 34 because the point where the ultrasound is reflected is well-defined for such an object, so that only minor deviations of reflection points 44 occur.

[0049] FIG. 4 sketches the position of reflection points 44 using the example of a curb 36 as object 30. As may be gathered from the illustration in FIG. 4, a curb 36 typically exhibits a broad dispersion in lateral extension 42, and no special clusters of reflection points 44 usually occur at a particular point. In addition, little dispersion along the longitudinal direction is observed so that longitudinal extension 40 of the dispersion is correspondingly low.
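Taken together, FIGS. 2 to 4 suggest the decision rule of claim 21: a curb shows a large lateral and a low longitudinal dispersion, a punctiform object a low dispersion in both directions, and a pedestrian a large dispersion in both directions. A minimal sketch of that rule, assuming an illustrative threshold of 0.5 m that does not come from the source:

```python
def classify(lateral_extent, longitudinal_extent, threshold=0.5):
    """Map dispersion extents to an object type (threshold is illustrative)."""
    wide_lat = lateral_extent >= threshold
    wide_lon = longitudinal_extent >= threshold
    if wide_lat and wide_lon:
        return "pedestrian"       # large dispersion in both directions
    if wide_lat:
        return "curb"             # wide laterally, narrow longitudinally
    if not wide_lon:
        return "punctiform"       # narrow in both directions
    return "unknown"              # narrow laterally, wide longitudinally
```

In a real system the threshold would be calibrated per sensor arrangement, and the rule would be one criterion among several rather than the sole classifier.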

[0050] The present invention is not restricted to the described exemplary embodiments and the aspects emphasized therein. Instead, a multitude of variations may lie within the scope of the present invention.