Processing of Sensor Data for a Driver Assistance System
20170372150 · 2017-12-28
Inventors
CPC classification
B60W2050/065
PERFORMING OPERATIONS; TRANSPORTING
B60W30/0956
PERFORMING OPERATIONS; TRANSPORTING
G08G1/165
PHYSICS
G06V20/58
PHYSICS
G06V20/588
PHYSICS
B60W50/06
PERFORMING OPERATIONS; TRANSPORTING
B60W2554/00
PERFORMING OPERATIONS; TRANSPORTING
G08G1/167
PHYSICS
International classification
B60W30/095
PERFORMING OPERATIONS; TRANSPORTING
Abstract
In order to process sensor data for a driver assistance system oriented towards the driver's comfort, sensor data that is sensed by a sensor device and describes objects is preprocessed such that a distinction is made between a driving zone and a non-driving zone, where the driving zone is designated as an object driving zone. The object driving zone is delimited by a boundary line. Since the sensor data is processed for a comfort-oriented driver assistance system, it does not have to describe the entire theoretical driving zone. Rather, the boundary line is used to delimit the driving zone within which the vehicle can normally be expected to drive. Based thereon, it is easy to determine an appropriate boundary line and significantly reduce the volume of data to be transmitted from the sensor device to a central control device of the comfort-oriented driver assistance system in order to describe the sensed objects.
Claims
1. A method for processing sensor data for a comfort driver assistance system for a motor vehicle, in which sensor data describing respective locations of a plurality of objects are acquired, the method comprising the acts of: extracting coordinates of the plurality of objects from the sensor data, wherein an inner side of an object of the plurality of objects, on which the motor vehicle is intended to drive past the object, is determined; distinguishing a region which can be traveled on and is in front of and/or behind each of the plurality of objects in a direction of travel of the motor vehicle from a region which cannot be traveled on, wherein the region which can be traveled on is an object travel region and wherein a boundary between the object travel region and the region which cannot be traveled on is a boundary line which extends beyond the object to a front and/or to a rear along the inner side of the object in the direction of travel; and determining the boundary line and forwarding data relating to the boundary line from a sensor device to a central control device of the comfort driver assistance system instead of the sensor data describing the object.
2. The method as claimed in claim 1, wherein determining the boundary line comprises connecting inner sides of adjacent ones of the plurality of objects.
3. The method as claimed in claim 2, wherein the plurality of objects are detected by a plurality of different sensor elements.
4. The method as claimed in claim 1, wherein the boundary line is a trajectory which describes a path of the motor vehicle which, coming from an outside in front of the object in the direction of travel, leads past the inside of the object at a minimum distance with a maximum steering angle, and/or describes a path which leads outward from the object with a maximum steering angle for a current driving speed after the object, the maximum steering angle being determined based on a speed of the motor vehicle.
5. The method as claimed in claim 1, wherein the boundary line is determined only if the vehicle is moving at a predetermined minimum speed.
6. The method as claimed in claim 1, wherein inner sides of the plurality of objects are determined as that side of the plurality of objects which is passed by a trajectory of the motor vehicle which is extended in the direction of travel.
7. The method as claimed in claim 1, further comprising performing automatic object recognition, wherein each of the plurality of objects is respectively assigned to a class of a predetermined set of classes of objects.
8. The method as claimed in claim 7, wherein performing automatic object recognition comprises performing automatic object recognition using an image analysis.
9. The method as claimed in claim 1, further comprising detecting moving objects by a speed measurement corresponding to a relative speed between the motor vehicle and the object.
10. The method as claimed in claim 9, wherein said moving objects are not taken into account as obstacles which restrict the region which can be traveled on when determining the region which can be traveled on.
11. The method as claimed in claim 9, wherein said moving objects are taken into account when determining the inner side of static objects.
12. The method as claimed in claim 1, further comprising determining a vehicle travel region, the boundary lines of which are defined by two trajectories each describing a theoretical path of the motor vehicle running to a right or a left in the direction of travel from the current location of the motor vehicle with a maximum steering angle, wherein the vehicle travel region is between the two trajectories, and wherein objects and an object travel region outside the vehicle travel region are ignored.
13. The method as claimed in claim 12, wherein the vehicle travel region is calculated in the central control device and is combined with the object travel regions to form a movement region.
14. The method as claimed in claim 1, wherein a plurality of sensor apparatuses each calculate object travel regions that are forwarded to the central control device.
15. A sensor apparatus comprising: a sensor element configured to acquire sensor data describing locations of a plurality of objects; and a processor controller coupled to the sensor element, wherein the processor controller is configured to: process sensor data received from the sensor element for a comfort driver assistance system for a motor vehicle, extract coordinates of the plurality of objects from the sensor data, an inner side of an object of the plurality of objects, on which the motor vehicle is intended to drive past the object, being defined, distinguish a region which can be traveled on and is in front of and/or behind each of the plurality of objects in a direction of travel from a region which cannot be traveled on, wherein the region which can be traveled on is an object travel region, wherein the boundary between the object travel region and the region which cannot be traveled on is a boundary line which extends beyond the object to a front and/or to a rear along the inner side of the object in the direction of travel, and determine the boundary line and forward data relating to the boundary line from a sensor device to a central control device of the comfort driver assistance system instead of the sensor data describing the object.
16. The sensor apparatus as claimed in claim 15, wherein the sensor element is at least one of a radar, a lidar, a camera, an ultrasonic sensor and a digital map.
17. The sensor apparatus as claimed in claim 15, wherein the sensor apparatus has at least two different sensor elements.
18. A system for processing sensor data for a comfort driver assistance system for a motor vehicle, comprising: at least one sensor apparatus as claimed in claim 15; and a central control device which is connected to the sensor apparatus, wherein the central control device is configured to create an occupancy grid.
19. The system as claimed in claim 18, wherein the system is configured to: extract coordinates of the plurality of objects from the sensor data, wherein an inner side of an object of the plurality of objects, on which the motor vehicle is intended to drive past the object, is determined, distinguish a region which can be traveled on and is in front of and/or behind each of the plurality of objects in a direction of travel of the motor vehicle from a region which cannot be traveled on, wherein the region which can be traveled on is an object travel region and wherein a boundary between the object travel region and the region which cannot be traveled on is a boundary line which extends beyond the object to a front and/or to a rear along the inner side of the object in the direction of travel, and determine the boundary line and forward data relating to the boundary line from a sensor device to a central control device of the comfort driver assistance system instead of the sensor data describing the object.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0040] The invention is explained by way of example below using the drawings.
DETAILED DESCRIPTION OF THE DRAWINGS
[0046] A sensor apparatus 1/1, 1/2 and 1/3 according to the invention respectively has a sensor element 2/1, 2/2 and 2/3.
[0047] In the present exemplary embodiment, three sensor apparatuses (n=1; 2; 3) are provided and are arranged in a motor vehicle. The first sensor apparatus 1/1 has a radar as a sensor element 2/1, the second sensor apparatus 1/2 has a lidar 2/2, which is also referred to as a laser scanner, and the third sensor apparatus 1/3 has a camera 2/3. The sensor elements 2/n are each connected to one of the processor controllers 3/n. The processor controllers 3/n preprocess the sensor signals output by the sensor elements 2/n.
[0048] The processor controllers 3/n are connected to a central control device 4 to which they forward the preprocessed sensor signals. The sensor signals from the different sensor apparatuses 1/n are combined in the central control device to form an occupancy grid.
[0049] The sensor elements 2/n are used to capture the coordinates of objects 5 along a route in front of a vehicle 7 in the direction of travel 6.
[0050] The vehicle 7 moves at a predetermined speed which is measured by a corresponding sensor (not illustrated) and is transmitted to the processor controllers 3/n via the central control device 4.
[0051] The present invention is based on the knowledge that static objects 5 are usually arranged beside the region of a road which can be traveled on and the edge of a road can usually be represented by a rectilinear curve or a curve having slight curvature, with the result that the representation of a multiplicity of static objects can be replaced with a boundary line between the region which can be traveled on and the static objects. This boundary line can already be produced in one of the sensor apparatuses 1/n and can be forwarded to the central control device 4 instead of a detailed description of the individual objects 5.
[0052] If the desire is to pass the object 5 on a particular side, referred to below as the inner side or travel region side 8, a region which can be traveled on, referred to below as the object travel region, is distinguished from a region which cannot be traveled on. The boundary line 10 between the object travel region 9 and the region which cannot be traveled on is, for example, a trajectory 11 which describes a theoretical path of the motor vehicle 7 which, coming from the outside in front of the object 5 in the direction of travel, leads past the inner side of the object 5 at a minimum distance with a maximum steering angle, and/or describes a theoretical path which leads outward from the object 5 with a maximum steering angle for the current driving speed after the object in the direction of travel.
[0054] In an alternative embodiment, the current driving speed can also be used to determine the maximum steering angle while taking into account that a deceleration of the motor vehicle is possible, the driving speed being reduced in the direction of travel 6 at the maximum possible deceleration rate. The slower the driving speed, the greater the maximum steering angle. This means that the curvature of the trajectory increases in the direction of travel.
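The speed dependence of the maximum steering angle described above can be sketched as follows. This is a minimal illustration under assumed parameters: a comfort limit on lateral acceleration and a kinematic bicycle model with an assumed wheelbase, neither of which is specified in the patent.

```python
import math

def max_steering_curvature(speed_mps, a_lat_max=3.0):
    """Maximum path curvature (1/m) drivable at a given speed, assuming a
    comfort limit on lateral acceleration (a = v^2 * kappa). The limit
    value of 3 m/s^2 is an illustrative assumption, not from the patent."""
    if speed_mps <= 0:
        raise ValueError("speed must be positive")
    return a_lat_max / (speed_mps ** 2)

def max_steering_angle(speed_mps, wheelbase_m=2.8, a_lat_max=3.0):
    """Corresponding steering angle (rad) under a simple kinematic
    bicycle model: tan(delta) = wheelbase * kappa."""
    kappa = max_steering_curvature(speed_mps, a_lat_max)
    return math.atan(wheelbase_m * kappa)
```

As the text states, the lower the driving speed, the greater the maximum steering angle; the functions above reproduce that monotonic relationship.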
[0055] The trajectory 11 describes the movement of a lateral edge 12 of the motor vehicle 7 facing the object 5. The motor vehicle 7 must pass the object 5 with this edge 12. For simple post-processing, the trajectory 11 can be offset inward by half the width b of the motor vehicle 7, with the result that the resulting trajectory 13 describes the movement of a center point of the motor vehicle 7 which leads past the inner side 8 of the object 5 without touching it. A center point of the motor vehicle 7 is, for example, the center of gravity of the motor vehicle 7 or a center of a base of the motor vehicle 7. As a result, the movement of the motor vehicle in the object travel region can be considered to be a movement of a punctiform body provided that the movement in the edge region of the object travel region is tangential to the trajectory 13 bounding the object travel region.
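The inward transfer of the edge trajectory 11 to the center trajectory 13 by half the vehicle width amounts to a sideways offset of a curve. A minimal sketch, assuming the trajectory is given as a 2-D polyline (a representation not fixed by the patent):

```python
import math

def offset_polyline(points, offset):
    """Shift each point of a 2-D polyline sideways by `offset`
    (positive = left of the travel direction), approximating the
    transfer of edge trajectory 11 to center trajectory 13."""
    out = []
    n = len(points)
    for i, (x, y) in enumerate(points):
        # Estimate the local travel direction from neighboring points.
        x0, y0 = points[max(i - 1, 0)]
        x1, y1 = points[min(i + 1, n - 1)]
        dx, dy = x1 - x0, y1 - y0
        length = math.hypot(dx, dy) or 1.0
        # Left-hand normal of the travel direction.
        nx, ny = -dy / length, dx / length
        out.append((x + offset * nx, y + offset * ny))
    return out
```

For a vehicle of width b = 1.8 m, passing `offset=0.9` shifts the edge trajectory to the vehicle center line, as in the text.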
[0056] The trajectories 11, 13 may be represented by functions which can be described very easily in mathematical terms. Curves whose curvature changes along their course can likewise be described easily, for example by means of spline functions. The sections along the edge of the object 5 can thereby also easily be taken into account. Such a representation in the form of a function describing the boundary line 10 of the object travel region makes it possible to represent the latter using few parameters and therefore with a small volume of data which can be very quickly transmitted from the sensor apparatus 1 to the central control device 4.
[0057] Furthermore, it is possible to determine the current location of the motor vehicle and the current direction of travel of the motor vehicle using the sensor apparatuses 1 with respect to the objects 5 arranged in front of the motor vehicle 7 in the direction of travel 6. Based on the current location, the current direction of travel and the driving speed of the motor vehicle, a region which can be traveled on is distinguished from a region which cannot be traveled on, this region which can be traveled on being referred to as a vehicle travel region 14.
[0059] Furthermore, the trajectories 15/1 and 15/2 may be mapped inward onto the trajectories 16/1 and 16/2 by half the width b/2 of the motor vehicle 7, with the result that they describe the movement of a center point of the motor vehicle 7.
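Membership in the vehicle travel region 14 bounded by the two extreme trajectories 15/1 and 15/2 can be sketched as follows. The sketch assumes the extreme trajectories are circular arcs at the minimum turning radius (an idealization of driving at the maximum steering angle), with the vehicle at the origin heading along +x; this geometric model is an assumption, not stated in the patent.

```python
import math

def in_vehicle_travel_region(px, py, turn_radius):
    """Vehicle at the origin, heading along +x. The extreme left/right
    trajectories are circles of the minimum turning radius centered at
    (0, r) and (0, -r). A point ahead of the vehicle lies in the vehicle
    travel region if it is outside both turning circles (i.e. it does
    not require a tighter turn than the maximum steering angle allows)."""
    if px < 0:
        return False  # behind the vehicle in the direction of travel
    dist_left = math.hypot(px, py - turn_radius)
    dist_right = math.hypot(px, py + turn_radius)
    return dist_left >= turn_radius and dist_right >= turn_radius
```

Objects failing this test can be ignored, as paragraph [0059] and claim 12 describe for objects outside the vehicle travel region.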
[0060] When determining the maximum steering angle, it is possible to take into account further parameters which influence the maximum possible steering angle, for example the current road surface, the weather conditions (the maximum steering angle is lower when the road is wet) and the contour of the route (a greater deceleration is possible on a rising route than on a falling route).
[0061] The object travel regions 9 of the individual objects and the vehicle travel region 14 are combined in the central control device 4, which is also referred to as merging.
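The merging step can be sketched as a set intersection over the cells of the occupancy grid: a cell belongs to the resulting movement region only if it is drivable with respect to the vehicle travel region 14 and every object travel region 9. The cell representation is an assumption; the patent only states that the regions are combined in the central control device 4.

```python
def merge_regions(vehicle_region, object_regions):
    """Movement region as the set of grid cells that lie in the vehicle
    travel region and in every object travel region (a cell that any
    object travel region excludes is not drivable). Cells are (ix, iy)
    tuples; the grid encoding is an illustrative assumption."""
    movement = set(vehicle_region)
    for region in object_regions:
        movement &= set(region)
    return movement
```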
[0062]
[0063]
[0064] Connecting a plurality of objects 5 using a common boundary line 10 considerably reduces the volume of data to be transmitted since only data relating to a single boundary line need to be transmitted instead of data relating to a description of a plurality of objects.
[0065] A sensor apparatus 1 may also comprise a plurality of different sensor elements 2 which each detect the objects 5 using a different measurement principle. The individual sensor elements 2 perceive the different objects 5 with different intensity, and particular sensor elements 2 may fail to detect particular objects which other sensor elements do detect. If a plurality of objects are connected using a boundary line, this also makes it possible to connect objects which are detected using different sensor elements 2 of the sensor apparatus 1. The advantages of different sensor elements can be combined using such a boundary line.
[0066] In the exemplary embodiment shown in
[0067] If the driving speed changes, the object travel region and the vehicle travel region need to be adapted in real time. The greater the driving speed, the longer also the section in front of the motor vehicle 7 in the direction of travel 6 for which the object travel region needs to be determined. The lower the driving speed, the more finely the representation of the static objects 5 can be resolved. At a low driving speed, a motor vehicle 7 can pass through a narrow gap between two adjacent static objects 5, whereas this is not possible at a high speed.
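The speed-dependent gap check described above can be sketched with a simple rule: a gap between two adjacent static objects is passable only if it exceeds the vehicle width plus a margin that grows with speed. The margin model and its factor are illustrative assumptions, not values from the patent.

```python
def gap_passable(gap_width, vehicle_width, speed_mps, margin_per_mps=0.05):
    """A gap between two static objects is passable if it exceeds the
    vehicle width plus a speed-dependent safety margin on each side.
    The margin factor (m of clearance per m/s, per side) is an
    illustrative assumption."""
    required = vehicle_width + 2 * margin_per_mps * speed_mps
    return gap_width >= required
```

This reproduces the behavior in the text: a narrow gap that is passable at low speed becomes impassable at high speed.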
[0068] Predetermined safety distances to the boundary lines 10 of the object travel regions and of the vehicle travel region can be taken into account as safety zones or safety regions in the central control device 4 when creating an occupancy grid. As a result, accelerations and decelerations of the motor vehicle may also be allowed in a predetermined region without the occupancy grid having to be changed and without the risk of a collision with one of the objects 5. These safety zones or safety regions are not added to the sensor data in the sensor apparatus, however, since the sensor data, even if they have been preprocessed with respect to object travel regions 9, are intended to represent the reality as exactly as possible. In this case, it should be taken into account, in particular, that a plurality of different driver assistance systems which take into account different safety margins or safety zones can be used in a motor vehicle.
[0069] The exemplary embodiments explained above show different ways of determining the boundary line 10. There are further possible ways of determining the boundary line 10. For example, the inner sides of static objects may be extended by a predetermined amount in the direction of travel, producing sections of the boundary line 10. The length of these sections is preferably set on the basis of the driving speed: the higher the driving speed, the longer the sections. The individual sections need not result in a continuous boundary line. Representing the boundary line by means of a plurality of such sections is expedient in particular at high speeds, at which sharp steering maneuvers are not possible in principle. Such rectilinear sections can be transmitted, with few data items, from the sensor apparatuses to the central control device.
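The section-based construction just described can be sketched as follows, assuming the direction of travel is the +x axis and using a look-ahead time to derive the speed-dependent extension length (the patent leaves the length unspecified).

```python
def extend_sections(sections, speed_mps, seconds_ahead=1.5):
    """Extend the inner side of each static object in the direction of
    travel (+x) by a speed-dependent length, yielding boundary-line
    sections that need not join into a continuous line. Each section is
    ((x0, y0), (x1, y1)); the look-ahead time is an assumed parameter."""
    length = speed_mps * seconds_ahead
    return [((x0, y0), (x1 + length, y1)) for ((x0, y0), (x1, y1)) in sections]
```

At higher driving speed the extension length grows, matching the rule in the text.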
[0070] Furthermore, it is possible to carry out automatic object recognition which is used to automatically recognize, for example, road boundary posts and road signs which constitute static objects. Since road boundary posts and road signs are arranged at the edge of the road, they can be connected to one another, in principle, in order to form the boundary line 10.
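Connecting recognized roadside objects into a boundary line can be sketched by filtering for the relevant object classes and ordering them along the route. The tuple format and class names below are illustrative assumptions; the patent only states that posts and signs recognized by automatic object recognition can be connected.

```python
def connect_roadside_objects(objects, classes=("delineator_post", "road_sign")):
    """Form a boundary line 10 by connecting recognized roadside objects
    (e.g. road boundary posts and road signs) in order of their
    longitudinal coordinate. Objects are (class_name, x, y) tuples;
    class names are hypothetical labels for this sketch."""
    edge_points = [(x, y) for cls, x, y in objects if cls in classes]
    return sorted(edge_points)  # ordered along the direction of travel
```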
[0071] Within the scope of the invention, sensor data acquired by means of a sensor apparatus can be processed for a comfort driver assistance system, sensor data describing objects being preprocessed in such a manner that a distinction is made between a region which can be traveled on and a region which cannot be traveled on, the region which can be traveled on being referred to as an object travel region. The object travel region can be bounded by a boundary line. Since the sensor data are processed for a comfort driver assistance system, they need not describe the entire region which can be theoretically traveled on, but rather it suffices if the boundary line bounds the region which can usually be usefully traveled on. This makes it possible, on the one hand, to very easily determine a corresponding boundary line and, on the other hand, to considerably reduce the volume of data which needs to be transmitted from the sensor apparatus to a central control device of the comfort driver assistance system in order to describe the detected objects.
[0072] List of reference symbols:
[0073] 1 Sensor apparatus
[0074] 2 Sensor element
[0075] 3 Processor controller
[0076] 4 Central control device
[0077] 5 Object
[0078] 6 Direction of travel
[0079] 7 Vehicle
[0080] 8 Inner side of an object
[0081] 9 Object travel region
[0082] 10 Boundary line
[0083] 11 Trajectory
[0084] 12 Lateral edge of the driver's own vehicle
[0085] 13 Trajectory
[0086] 14 Vehicle travel region
[0087] 15 Trajectory
[0088] 16 Trajectory
[0089] 17 Edge of the road
[0090] b Width
[0091] The foregoing disclosure has been set forth merely to illustrate the invention and is not intended to be limiting. Since modifications of the disclosed embodiments incorporating the spirit and substance of the invention may occur to persons skilled in the art, the invention should be construed to include everything within the scope of the appended claims and equivalents thereof.