Correcting a position of a vehicle with SLAM
11628857 · 2023-04-18
CPC classification
G01S2015/935
PHYSICS
G01C22/00
PHYSICS
B60W30/06
PERFORMING OPERATIONS; TRANSPORTING
G05D1/0272
PHYSICS
B60W60/001
PERFORMING OPERATIONS; TRANSPORTING
B60W40/12
PERFORMING OPERATIONS; TRANSPORTING
International classification
B60W60/00
PERFORMING OPERATIONS; TRANSPORTING
B60W30/06
PERFORMING OPERATIONS; TRANSPORTING
G01C22/00
PHYSICS
Abstract
A method for correcting a position of a vehicle when parking in a parking space. The method includes determining the position of the vehicle on the basis of odometry information, sensing ultrasonic signals from a linear object, carrying out a method for simultaneous localization and mapping (SLAM) on the basis of the linear object and the ultrasonic signals, and correcting the position of the vehicle. A control device for a driving support system of a vehicle is also disclosed, which is designed to receive odometry information of the vehicle, to receive ultrasonic signals from at least one ultrasonic sensor of the driving support system, and to carry out the aforementioned method. A driving support system for a vehicle with an aforementioned control device and with at least one ultrasonic sensor is disclosed. The invention likewise relates to a vehicle with an aforementioned driving support system.
Claims
1. A method for correcting a position of a vehicle when parking in a parking space, the method comprising: generating an estimate of the position of the vehicle on the basis of odometry information of the vehicle; sensing ultrasonic signals from a linear object; sensing the linear object in a longitudinal direction of the vehicle by at least one ultrasonic sensor; carrying out a method for simultaneous localization and mapping (SLAM) of the vehicle in a map of an environment surrounding the vehicle on the basis of the linear object and the ultrasonic signals, wherein the SLAM is performed based on the ultrasonic signals providing distance information only by correlating the ultrasonic signals with one another; and correcting the estimate of the position of the vehicle in the map on the basis of odometry information by the simultaneous localization and mapping on the basis of the linear object.
2. The method according to claim 1, wherein sensing the linear object in the longitudinal direction of the vehicle comprises verifying the sensed ultrasonic signals as belonging to the linear object.
3. The method according to claim 1, wherein sensing the linear object in the longitudinal direction of the vehicle comprises sensing a plurality of linear objects, and the method further comprises assigning the sensed ultrasonic signals to one of the plurality of linear objects.
4. The method according to claim 3, wherein verifying the sensed ultrasonic signals as belonging to the linear object and/or assigning the sensed ultrasonic signals to one of the plurality of linear objects comprises determining a Mahalanobis distance.
5. The method according to claim 1, wherein carrying out a method for simultaneous localization and mapping comprises carrying out a Kalman filtering comprising: generating an initial estimate of the position of the vehicle using an odometry function, based on the odometry information; and correcting the initial estimate of the position on the basis of the linear object.
6. The method according to claim 5, further comprising determining the odometry parameters of the vehicle.
7. The method according to claim 6, wherein determining the odometry parameters comprises adapting the odometry function of the Kalman filter.
8. A control device for a driving support system of a vehicle which is configured to: receive odometry information of the vehicle; receive ultrasonic signals from at least one ultrasonic sensor of the driving support system; and carry out the method according to claim 1.
9. A driving support system for a vehicle comprising a control device according to claim 8; and at least one ultrasonic sensor.
10. A vehicle with a driving support system according to claim 9.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(1) The invention is explained in more detail below with reference to the attached drawing and on the basis of preferred embodiments. The features described can represent an aspect of the invention both individually and in combination. Features of different exemplary embodiments can be transferred from one exemplary embodiment to another.
(2) In the figures:
DETAILED DESCRIPTION
(6) The vehicle 10 comprises a driving support system 12, which in this exemplary embodiment is designed for parking the vehicle 10 in a parking space 14.
(7) The driving support system 12 comprises a control device 16, which is formed here by an electronic control unit (ECU) of the vehicle 10. In addition, the driving support system 12 comprises a plurality of ultrasonic sensors 18. In the present exemplary embodiment, the driving support system 12 comprises fourteen ultrasonic sensors 18, six of which are arranged in a front region 20 and six of which are arranged in a rear region 22 of the vehicle 10. The ultrasonic sensors 18 in the front region 20 and in the rear region 22 of the vehicle 10 are attached to its bumpers. In addition, an ultrasonic sensor 18 is arranged on each side 24 of the vehicle 10.
(8) Each of the ultrasonic sensors 18 is designed to send ultrasonic pulses into a surrounding environment 26 of the vehicle 10 and to receive reflections of the ultrasonic pulses produced by objects 28 in the surrounding environment 26.
(9) The ultrasonic sensors 18 are in each case connected to the control device 16 by way of a data bus (not shown here) for data transmission. Sensor information produced by the ultrasonic sensors 18 is transmitted via the data bus to the control device 16, and is jointly evaluated and further processed there.
(10) Furthermore, the control device 16 is designed to receive odometry information of the vehicle 10.
(11) A method according to a second embodiment for correcting a position of the vehicle 10 when parking in the parking space 14 is described below with reference to the attached drawing.
(12) The method begins in step S100 with sensing of the linear objects 28 in the longitudinal direction of the vehicle 10 by the ultrasonic sensor 18 positioned on the corresponding side 24 of the vehicle 10. The sensing of the linear objects 28 is performed when the vehicle 10 is driving past the parking space 14. The parking space 14 is designed here for parallel parking, and the linear objects 28 are formed here by a base edge and a kerbstone.
(13) In the subsequent step S110, the position of the vehicle 10 is determined on the basis of odometry information of the vehicle 10. The determination is performed on the basis of information obtained by the vehicle 10 itself on the basis of its odometry sensors, i.e. a steering angle and revolutions of the wheels of the vehicle 10.
(14) Step S120 concerns sensing of ultrasonic signals from the linear objects 28. As stated above, ultrasonic pulses are emitted by the corresponding ultrasonic sensor 18 and are reflected at the linear object 28, and the reflections are received again by the ultrasonic sensor 18. The ultrasonic sensor 18 determines a distance in relation to the linear object 28 from a transit time of the sound from the ultrasonic sensor 18 to the linear object 28 and back again to the ultrasonic sensor 18.
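The time-of-flight principle described above can be illustrated with a minimal sketch; the function name and the 5.8 ms example are hypothetical, and the speed of sound in air is taken as roughly 343 m/s at about 20 °C.

```python
# The sensor measures the round-trip transit time of the ultrasonic pulse,
# so the one-way distance is half the transit time times the speed of sound.

SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at ~20 degrees C

def distance_from_transit_time(transit_time_s: float) -> float:
    """Return the sensor-to-object distance in metres for a measured
    round-trip transit time in seconds."""
    return SPEED_OF_SOUND_M_S * transit_time_s / 2.0

# A round trip of 5.8 ms corresponds to a distance of about one metre.
d = distance_from_transit_time(0.0058)
```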
(15) In a step S130, the sensed ultrasonic signals are assigned to one of the linear objects 28. It is additionally verified whether the ultrasonic signals sensed belong to the linear object 28.
(16) Used in each case for this purpose is a Mahalanobis distance, which provides a distance measure between points in a multi-dimensional vector space. In the case of multivariate distributions, the m coordinates of a point are represented as an m-dimensional column vector, which is regarded as a realization of a random vector X with the covariance matrix Σ. The distance between two points x and y distributed in this way is then determined by the Mahalanobis distance d(x,y)=√((x−y).sup.TΣ.sup.−1(x−y)).
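The Mahalanobis distance used for the verification and assignment steps can be sketched as follows; the function name is an assumption, and with the identity covariance the measure reduces to the ordinary Euclidean distance.

```python
import numpy as np

def mahalanobis(x, y, cov):
    """Mahalanobis distance between points x and y, given the
    covariance matrix cov of the underlying distribution."""
    diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    # d(x, y) = sqrt((x - y)^T  cov^-1  (x - y))
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

# With the identity covariance the result equals the Euclidean distance;
# a larger variance along an axis shrinks the distance along that axis.
d_iso = mahalanobis([1.0, 0.0], [0.0, 0.0], np.eye(2))
d_aniso = mahalanobis([1.0, 0.0], [0.0, 0.0], np.diag([4.0, 1.0]))
```

In practice, an ultrasonic echo would be assigned to the linear object whose predicted measurement lies within a chosen Mahalanobis-distance threshold.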
(17) Step S140 concerns carrying out a method for simultaneous localization and mapping (SLAM) on the basis of the linear object 28 and the ultrasonic signals. The vehicle position on the basis of the odometry information and odometry parameters are additionally used for this.
(18) The method for simultaneous localization and mapping comprises carrying out a Kalman filtering on the basis of a state-space modelling, in which a distinction is explicitly made between the dynamics of the system state and the process of its measurement. The estimate of the state is in this case based on a knowledge of earlier observations as obtained by the linear object 28.
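The predict/correct cycle of the Kalman filtering named above can be sketched generically; this is a minimal linearized step, with the function name and the matrix arguments chosen for illustration rather than taken from the patent.

```python
import numpy as np

def ekf_step(x, P, F, Q, z, H, R):
    """One Kalman predict/correct step (linearized form).
    x, P    : state estimate and its covariance
    F, Q    : motion Jacobian and process noise (the odometry model)
    z, H, R : measurement, measurement Jacobian, measurement noise
    """
    # Predict: propagate the estimate through the motion model.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Correct: weigh the predicted measurement against the observed one.
    y = z - H @ x_pred                   # innovation
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

In the method described here, the predict stage corresponds to the odometry-based initial estimate and the correct stage to the update from the ultrasonic observations of the linear object.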
(19) For a parking space 14 with n linear objects 28, normally no fewer than two (n≥2), let x.sub.k−1.sup.V=[x.sub.k−1 y.sub.k−1 θ.sub.k−1].sup.T be the position vector of the vehicle 10 in relation to the parking space 14. Furthermore, a set of odometry parameters is specified as a vector p. In addition, let x.sub.k−1.sup.i be the position of the ith linear object 28 at the point in time k−1. This results in the Kalman state according to
(20) x.sub.k−1=[x.sub.k−1.sup.V p x.sub.k−1.sup.1 . . . x.sub.k−1.sup.n].sup.T
(21) Let g be the odometry function, which calculates a curve-shaped displacement vector v=[s.sub.k u.sub.θ,k].sup.T from the parameters p, the ticks of the wheels, the received steering angle and the noise vector q=[q.sub.s q.sub.θ].sup.T according to
v=g(p)+q
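A possible form of the odometry function g can be sketched with a simple bicycle model; the parameterization (metres per wheel tick, wheelbase) and the function name are assumptions for illustration, not the patent's own definition.

```python
import math

def odometry_displacement(ticks, metres_per_tick, steering_angle_rad, wheelbase_m):
    """Hypothetical odometry function g: computes the curve-shaped
    displacement v = [s_k, u_theta_k] from the wheel ticks and the
    received steering angle, using a simple bicycle model.
    metres_per_tick and wheelbase_m play the role of the parameters p."""
    s_k = ticks * metres_per_tick  # arc length driven since the last step
    # Heading change over the arc, from the bicycle-model relation
    # d(theta)/d(s) = tan(steering angle) / wheelbase.
    u_theta_k = s_k * math.tan(steering_angle_rad) / wheelbase_m
    return s_k, u_theta_k
```

In the noisy case, the additive noise vector q of equation (21) would be added to the returned displacement.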
(22) The covariance of the disturbance vector is
(23) Q=E[qq.sup.T]=diag(σ.sub.s.sup.2,σ.sub.θ.sup.2)
(24) In step S150, the position of the vehicle 10 as determined on the basis of odometry information is corrected by the simultaneous localization and mapping on the basis of the linear object 28.
(25) The curve-shaped displacement vector v gives a Cartesian displacement vector as
(26) Δ.sub.k=[s.sub.k cos(θ.sub.k−1+u.sub.θ,k/2) s.sub.k sin(θ.sub.k−1+u.sub.θ,k/2) u.sub.θ,k].sup.T
(27) The new position of the vehicle is consequently obtained as:
(28) x.sub.k.sup.V=x.sub.k−1.sup.V+Δ.sub.k
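The pose update from the curve-shaped displacement can be sketched as follows; evaluating the heading at the midpoint of the arc is a common odometry approximation and an assumption here, as is the function name.

```python
import math

def update_pose(x, y, theta, s_k, u_theta_k):
    """Convert the curve-shaped displacement [s_k, u_theta_k] into a
    Cartesian displacement and add it to the previous pose (x, y, theta),
    evaluating the heading at the midpoint of the driven arc."""
    dx = s_k * math.cos(theta + u_theta_k / 2.0)
    dy = s_k * math.sin(theta + u_theta_k / 2.0)
    return x + dx, y + dy, theta + u_theta_k
```

Driving straight ahead (zero heading change) simply advances the pose along the current heading; a non-zero heading change turns the pose by that amount.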
(29) The aforementioned position of the vehicle results in the prediction function for the ith feature of the state in the case of a linear object 28 as:
(30) x.sub.k.sup.i(j)=ƒ.sup.i(x.sub.k−1)=x.sub.k−1.sup.i(j)
where x.sub.k−1.sup.i(j) is the jth component of the ith feature of the state (x axis, y axis and angle).
(31) Finally, in step S160, odometry parameters of the vehicle 10 are determined. As evident from the considerations set out above, the function ƒ.sup.i implicitly includes the odometry function g, which comprises the parameter vector p as part of the state. This allows the odometry parameters of the vehicle 10 to be determined. In particular, wheel circumferences of the vehicle 10 and a steering-angle conversion table can be adapted as odometry parameters.
(32) Steps S140, S150 and S160 are in the present case carried out in parallel, and the position of the vehicle 10 is continuously corrected on the basis thereof.
LIST OF REFERENCE SIGNS
(33) 10 Vehicle
12 Driving support system
14 Parking space
16 Control device
18 Ultrasonic sensor
20 Front region
22 Rear region
24 Side
26 Surrounding environment
28 Linear object, object