METHOD AND DRIVER ASSISTANCE SYSTEM FOR CLASSIFYING OBJECTS IN THE ENVIRONMENT OF A VEHICLE
20220397665 · 2022-12-15
Assignee
Inventors
CPC classification
G01S2015/935
PHYSICS
G01S2015/465
PHYSICS
G01S7/539
PHYSICS
International classification
Abstract
A method for classifying objects in the environment of a vehicle with the aid of ultrasonic sensors. In the method, ultrasonic signals are emitted, ultrasonic echoes are received from objects in the environment, and the position of a reflection point relative to the ultrasonic sensors is determined using lateration, reflection points being continuously determined and the reflection points being allocated to objects in the environment. Dispersion parameters relating to the position of the reflection points allocated to an object are determined and used as a classification criterion with regard to the type of object. A driver assistance system and a vehicle including such a driver assistance system are also described.
Claims
1-11. (canceled)
12. A method for classifying objects in an environment of a vehicle, the method comprising the following steps: emitting ultrasonic signals using ultrasonic sensors; receiving ultrasonic echoes from objects in the environment; determining a position of reflection points relative to the ultrasonic sensors using lateration, the reflection points being continuously determined and the reflection points being allocated to objects in the environment; determining dispersion parameters relating to the position of the reflection points allocated to an object of the objects; and using the dispersion parameters as a classification criterion with regard to a type of the object.
13. The method as recited in claim 12, wherein the dispersion parameters separately indicate a dispersion of the reflection points along two directions orthogonal to each other.
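As a non-limiting illustration of claim 13, the dispersion of the reflection points can be indicated separately along two orthogonal directions, e.g. as the standard deviation of the point coordinates along the x and y axes of the vehicle frame. The function name and the choice of axes below are illustrative assumptions, not taken from the patent:

```python
import math

def axis_dispersions(points):
    """Standard deviation of reflection-point positions along two
    orthogonal axes (here: the x and y axes of the vehicle frame),
    as one possible pair of dispersion parameters."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sx = math.sqrt(sum((p[0] - mx) ** 2 for p in points) / n)
    sy = math.sqrt(sum((p[1] - my) ** 2 for p in points) / n)
    return sx, sy
```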
14. The method as recited in claim 12, wherein a central point of the object is determined by averaging the reflection points allocated to the object, and a portion of the reflection points that is located inside or outside a predefined radius around the central point of the object is determined as dispersion parameters.
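An illustrative sketch of claim 14: the central point is obtained by averaging the reflection points allocated to the object, and the portion of points inside a predefined radius around it serves as a dispersion parameter. Names and the concrete radius are illustrative assumptions:

```python
import math

def radius_dispersion(points, radius):
    """Central point of the object by averaging the allocated
    reflection points, and the portion of points located inside
    a predefined radius around that central point."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    inside = sum(
        1 for (x, y) in points
        if math.hypot(x - cx, y - cy) <= radius
    )
    return (cx, cy), inside / n
```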
15. The method as recited in claim 12, wherein a bounding box is determined, which indicates an area in which, with the exception of reflection points determined to be outliers, all reflection points allocated to the object are situated, the dimensions of the bounding box being determined as the dispersion parameters.
16. The method as recited in claim 15, wherein the bounding box has a longitudinal extension and a lateral extension, the longitudinal extension indicating an intensity of a dispersion along a longitudinal direction, and the lateral extension indicating an intensity of the dispersion along a lateral direction.
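The bounding box of claims 15 and 16 can be sketched as follows: outliers are discarded, and the extents of the remaining points along the longitudinal and lateral directions form the dispersion parameters. The outlier test (distance from the centroid beyond a multiple of the mean spread) and the axis convention are illustrative assumptions, as the claims do not fix them:

```python
import math

def bounding_box_extensions(points, outlier_sigma=3.0):
    """Axis-aligned bounding box of the reflection points after
    discarding outliers (here: points farther than outlier_sigma
    standard deviations of centroid distance from the mean).
    Returns (longitudinal, lateral) extension, with x taken as
    the direction pointing away from the vehicle."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    d = [math.hypot(x - cx, y - cy) for x, y in points]
    mean_d = sum(d) / n
    std_d = math.sqrt(sum((di - mean_d) ** 2 for di in d) / n) or 1.0
    kept = [p for p, di in zip(points, d)
            if di <= mean_d + outlier_sigma * std_d]
    xs = [p[0] for p in kept]
    ys = [p[1] for p in kept]
    longitudinal = max(xs) - min(xs)
    lateral = max(ys) - min(ys)
    return longitudinal, lateral
```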
17. The method as recited in claim 12, wherein an occupancy grid is set up for each of the objects in order to determine the dispersion parameters, and cells of the occupancy grid each have an occupancy value that indicates a number of reflection points allocated to the cell.
18. The method as recited in claim 17, wherein based on the occupancy values of the cells of the occupancy grid, a longitudinal extension and a lateral extension are determined as dispersion parameters, the longitudinal extension indicating an intensity of a dispersion along a longitudinal direction and the lateral extension indicating an intensity of the dispersion along a lateral direction.
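A minimal sketch of the occupancy grid of claims 17 and 18: each cell carries an occupancy value counting the reflection points allocated to it, and the spans of occupied cells yield a longitudinal and a lateral extension. The cell size is an illustrative assumption:

```python
def occupancy_grid(points, cell_size=0.1):
    """Occupancy grid per object: each cell's occupancy value is
    the number of reflection points falling into that cell.  From
    the occupied cells, longitudinal (x) and lateral (y) extensions
    are derived.  cell_size in metres is illustrative."""
    grid = {}
    for x, y in points:
        cell = (int(x // cell_size), int(y // cell_size))
        grid[cell] = grid.get(cell, 0) + 1
    cols = [c[0] for c in grid]
    rows = [c[1] for c in grid]
    longitudinal = (max(cols) - min(cols) + 1) * cell_size
    lateral = (max(rows) - min(rows) + 1) * cell_size
    return grid, longitudinal, lateral
```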
19. The method as recited in claim 16, wherein: the longitudinal extension runs parallel to a direction pointing away from the vehicle, and the lateral extension runs perpendicular to the longitudinal extension, or a direction of a greatest extension of the object is determined, and the lateral extension runs parallel to the direction and the longitudinal extension runs perpendicular to the direction, or an object model having a point geometry or a line geometry is allocated to the object by evaluating a relative position of the reflection points allocated to the object.
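For the alternative in claim 19 in which the direction of the object's greatest extension is determined, one common way (an illustrative choice, not specified in the claim) is the principal eigenvector of the 2x2 covariance matrix of the reflection points, which has a closed form:

```python
import math

def principal_direction(points):
    """Direction of greatest extension of the object, computed as
    the angle of the dominant eigenvector of the 2x2 covariance
    matrix of the reflection points (closed form for 2D)."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    sxx = sum((x - cx) ** 2 for x, y in points) / n
    syy = sum((y - cy) ** 2 for x, y in points) / n
    sxy = sum((x - cx) * (y - cy) for x, y in points) / n
    # Angle of the dominant eigenvector of [[sxx, sxy], [sxy, syy]]
    return 0.5 * math.atan2(2.0 * sxy, sxx - syy)  # radians vs. x axis
```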
20. The method as recited in claim 19, wherein the object model having a line geometry is allocated to the object, and wherein the lateral extension runs parallel to an orientation of the line and the longitudinal extension runs perpendicular thereto.
21. The method as recited in claim 19, wherein in the classification, a curb is identified by a large dispersion in a direction of the lateral extension and a low dispersion in a direction of the longitudinal extension, a punctiform object is identified by a low dispersion in the direction of the lateral extension and a low dispersion in the direction of the longitudinal extension, and a pedestrian is identified by a large dispersion in the direction of the lateral extension and a large dispersion in the direction of the longitudinal extension.
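The decision rule of claim 21 maps the lateral/longitudinal dispersion pattern to an object type. A minimal sketch follows; the threshold separating "large" from "low" dispersion is an illustrative assumption, not taken from the patent, and the remaining combination (large longitudinal, low lateral) is left unclassified since the claim does not address it:

```python
def classify(lateral, longitudinal, large=0.5):
    """Claim 21 decision rule: curb = large lateral / low
    longitudinal dispersion; punctiform object = low in both
    directions; pedestrian = large in both directions.  The
    'large' threshold (metres) is an illustrative assumption."""
    lat_large = lateral >= large
    lon_large = longitudinal >= large
    if lat_large and not lon_large:
        return "curb"
    if not lat_large and not lon_large:
        return "punctiform object"
    if lat_large and lon_large:
        return "pedestrian"
    return "unknown"  # combination not addressed by the claim
```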
22. A driver assistance system, comprising: at least two ultrasonic sensors having at least partially overlapping fields of view; and a control unit; wherein the driver assistance system is configured to classify objects in an environment of a vehicle, the driver assistance system configured to: emit ultrasonic signals using the ultrasonic sensors; receive ultrasonic echoes from objects in the environment; determine a position of reflection points relative to the ultrasonic sensors using lateration, the reflection points being continuously determined and the reflection points being allocated to objects in the environment; determine dispersion parameters relating to the position of the reflection points allocated to an object of the objects; and use the dispersion parameters as a classification criterion with regard to a type of the object.
23. A vehicle, including a driver assistance system, the driver assistance system comprising: at least two ultrasonic sensors having at least partially overlapping fields of view; and a control unit; wherein the driver assistance system is configured to classify objects in an environment of a vehicle, the driver assistance system configured to: emit ultrasonic signals using the ultrasonic sensors; receive ultrasonic echoes from objects in the environment; determine a position of reflection points relative to the ultrasonic sensors using lateration, the reflection points being continuously determined and the reflection points being allocated to objects in the environment; determine dispersion parameters relating to the position of the reflection points allocated to an object of the objects; and use the dispersion parameters as a classification criterion with regard to a type of the object.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0035] Embodiments of the present invention will be described in greater detail based on the figures and the following description.
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS
[0040] Identical or similar elements are denoted by the same reference numerals in the following description of embodiments of the present invention, and a repeated description of these elements is dispensed with in individual cases. The figures represent the subject matter of the present invention only schematically.
[0042] Ultrasonic sensors 12, 13, 14, 15 are situated at the front of vehicle 1 in such a way that at least the fields of view of two ultrasonic sensors 12, 13, 14, 15 overlap at least partially. In the situation depicted in
[0043] For the classification of object 30, it is provided to determine the position of reflection points 44, see
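The lateration mentioned in paragraph [0043] amounts to intersecting two range circles around two sensors with overlapping fields of view. The sketch below is an illustrative implementation of this standard geometric construction, not code from the patent; the convention that the object lies at positive x (in front of the sensors) is an assumption:

```python
import math

def laterate(s1, r1, s2, r2):
    """Position of a reflection point from two range measurements:
    intersect the circle of radius r1 around sensor s1 with the
    circle of radius r2 around sensor s2, keeping the solution with
    the larger x (assumed to face away from the vehicle)."""
    (x1, y1), (x2, y2) = s1, s2
    d = math.hypot(x2 - x1, y2 - y1)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return None  # circles do not intersect: inconsistent echo pair
    a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2 * d)
    h = math.sqrt(max(r1 ** 2 - a ** 2, 0.0))
    # Base point on the line between the sensors, then offset by h
    bx = x1 + a * (x2 - x1) / d
    by = y1 + a * (y2 - y1) / d
    p1 = (bx + h * (y2 - y1) / d, by - h * (x2 - x1) / d)
    p2 = (bx - h * (y2 - y1) / d, by + h * (x2 - x1) / d)
    return p1 if p1[0] >= p2[0] else p2
```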
[0044] It is furthermore provided in the method to examine the dispersion of the positions of reflection points 44 more closely and to determine corresponding dispersion parameters. These dispersion parameters are then used as a criterion in a classification of the type of object 30.
[0045] The following
[0047] For an analysis of the dispersion of reflection points 44 sketched in
[0050] The present invention is not restricted to the described exemplary embodiments and the aspects emphasized therein. Instead, a multitude of variations may lie within the scope of the present invention.