METHOD FOR CALIBRATING A YAW RATE SENSOR OF A VEHICLE
20230356725 · 2023-11-09
Assignee
Inventors
CPC classification
B60W60/00
PERFORMING OPERATIONS; TRANSPORTING
G01C21/188
PHYSICS
International classification
Abstract
A method for calibrating a yaw rate sensor of a vehicle comprises detecting a yaw rate of the vehicle from measurement data from the yaw rate sensor. A change in yaw angle is ascertained from sensor data from at least one optical surroundings sensor unit, and an offset of the yaw rate sensor is ascertained by fusion of the detected yaw rate and the ascertained change in yaw angle. The yaw rate sensor is calibrated according to the ascertained offset.
Claims
1. A method for calibrating a yaw rate sensor of a vehicle comprising: detecting a yaw rate of the vehicle from measurement data from the yaw rate sensor, ascertaining a change in yaw angle from sensor data from at least one optical surroundings sensor unit, ascertaining an offset of the yaw rate sensor by fusion of the detected yaw rate and the ascertained change in yaw angle, and calibrating the yaw rate sensor according to the ascertained offset.
2. The method as claimed in claim 1, further comprising determining a change in orientation of the vehicle by fusion of the ascertained yaw rate and the ascertained change in yaw angle with a Kalman filter.
3. The method as claimed in claim 1, further comprising ascertaining a fused yaw rate and a fused change in yaw angle from the fusion of the detected yaw rate and the ascertained change in yaw angle.
4. The method as claimed in claim 1, wherein the detecting the yaw rate is performed periodically with a first period duration and the ascertaining the change in yaw angle is performed periodically with a second period duration, which is different from the first period duration.
5. The method as claimed in claim 1, wherein the fusion of the yaw rate and the change in yaw angle is performed periodically with a fusion period duration and the ascertaining the offset further comprises ascertaining from a plurality of periodically and successively ascertained measured values relating to the yaw rate and change in yaw angle.
6. The method as claimed in claim 1, further comprising ascertaining a respective image position of at least one image feature in successive frames of an image sequence generated by at least one vehicle camera of the at least one optical surroundings sensor unit, and ascertaining the change in yaw angle on the basis of the change in the image position of the at least one image feature between recording times of the frames.
7. The method as claimed in claim 1, wherein an offset compensation is applied to the detected yaw rate.
8. A computer program for calibrating a yaw rate sensor of a vehicle, wherein the computer program comprises instructions for: detecting a yaw rate of the vehicle from measurement data from the yaw rate sensor, ascertaining a change in yaw angle from sensor data from at least one optical surroundings sensor unit, ascertaining an offset of the yaw rate sensor by fusion of the detected yaw rate and the ascertained change in yaw angle, and calibrating the yaw rate sensor according to the ascertained offset.
9. A device for calibrating a yaw rate sensor of a vehicle, wherein the device has a processing unit with instructions for: detecting a yaw rate of the vehicle from measurement data from the yaw rate sensor, ascertaining a change in yaw angle from sensor data from at least one optical surroundings sensor unit, ascertaining an offset of the yaw rate sensor by fusion of the detected yaw rate and the ascertained change in yaw angle, and calibrating the yaw rate sensor according to the ascertained offset.
10. The device as claimed in claim 9, wherein the device is in a vehicle.
11. The method of claim 1, further comprising determining the change in orientation of the vehicle based on fusion of a corrected yaw rate and visual odometry.
12. The method of claim 11, further comprising carrying out one of semi-automated and fully automated guidance of the vehicle along an ascertained ego trajectory based on the determined change in orientation.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0025] The invention is described in more detail below with reference to expedient exemplary embodiments illustrated in the figures.
DETAILED DESCRIPTION
[0030] Reference numeral 1 in the figures denotes a vehicle.
[0031] The yaw rate sensor 2 of the vehicle 1 is designed to provide measurement data for a yaw rate ψ_gyro of the vehicle 1. Furthermore, the vehicle 1 comprises at least one vehicle camera 3 that captures an image sequence. In particular, the successive frames of the image sequence are used to ascertain a yaw rate ψ_visu and a change in yaw angle Δψ_visu of the vehicle 1. A new orientation ψ of the vehicle 1 can be determined from a change Δψ_visu. For example, the vehicle camera 3 comprises an evaluation unit designed to ascertain the change in yaw angle Δψ_visu of the vehicle 1 from the image sequence. The change in yaw angle Δψ_visu is determined, for example, by first combining the frames to produce an overall image and taking the latter as a basis for the evaluation.
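The image-based evaluation can be sketched as follows. This is a minimal illustration, assuming a pinhole camera, an approximately pure rotation about the vertical axis between frames, and horizontal feature coordinates measured relative to the principal point; none of these modeling details are specified in the text.

```python
import math

def yaw_change_from_features(u_prev, u_curr, focal_px):
    """Estimate the yaw-angle change (rad) between two frames from the
    horizontal image positions of tracked features.

    Under the pinhole model, a feature at horizontal coordinate u (pixels,
    relative to the principal point) lies at bearing atan(u / f) from the
    optical axis. For a pure yaw rotation, every feature's bearing shifts
    by the same amount, so the yaw change is the mean bearing change.
    """
    if len(u_prev) != len(u_curr) or not u_prev:
        raise ValueError("need equally sized, non-empty feature lists")
    deltas = [math.atan2(u1, focal_px) - math.atan2(u0, focal_px)
              for u0, u1 in zip(u_prev, u_curr)]
    return sum(deltas) / len(deltas)
```

Averaging over many features suppresses tracking noise; a robust mean (e.g. a median) would additionally reject outlier tracks.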
[0032] The vehicle 1 comprises e.g. a processing unit 4 designed to ascertain an offset ψ_gyro,offset of the yaw rate sensor 2. The detected yaw rate ψ_gyro and the change in yaw angle Δψ_visu are e.g. transferred to the processing unit 4, the processing unit 4 being designed to fuse the detected yaw rate ψ_gyro and the change in yaw angle Δψ_visu and to ascertain the offset therefrom. The measured yaw rate ψ_gyro is thus taken as a basis for ascertaining a state variable ψ_fus and consequently a corrected yaw rate that at least closely matches the actual yaw rate of the vehicle. Compensating for the offset allows the vehicle orientation to be determined in a vehicle environment model and, based thereon, the semi- or fully automated guidance of the vehicle 1 along a planned ego trajectory.
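The offset compensation and the resulting orientation update reduce to a simple additive-bias model. The sign convention and the Euler integration step below are assumptions of this sketch, not taken from the text.

```python
def corrected_yaw_rate(psi_gyro, psi_gyro_offset):
    """Offset compensation: subtract the learned sensor offset from the
    measured yaw rate (additive-bias model with assumed sign convention)."""
    return psi_gyro - psi_gyro_offset

def updated_orientation(psi, psi_corrected, dt):
    """New vehicle orientation after integrating the corrected yaw rate
    over one time step dt (simple Euler step)."""
    return psi + psi_corrected * dt
```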
[0033] The fusion in a Kalman filter 5 takes place on the basis of the yaw rate ψ_gyro from the yaw rate sensor 2 and the change in yaw angle Δψ_visu from the at least one vehicle camera 3. Fusing the offset-distorted measurement signal from the yaw rate sensor 2 with the yaw rate ψ_visu calculated by means of visual odometry in a fusion framework such as the Kalman filter 5 allows the offset of the yaw rate sensor 2 to be ascertained during vehicle movement.
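One way to realize such a fusion is a small Kalman filter whose state holds the yaw-angle change accumulated since the last visual update together with the gyro offset. The state layout, noise values, and reset behavior below are illustrative assumptions of this sketch, not details taken from the text.

```python
import numpy as np

class YawOffsetKF:
    """Minimal 2-state Kalman filter sketch: x = [delta_psi, offset].

    delta_psi integrates the offset-corrected gyro rate since the last
    visual-odometry update; offset models the slowly varying gyro bias.
    All noise parameters are illustrative assumptions.
    """
    def __init__(self, q_dpsi=1e-6, q_off=1e-9, r_vis=1e-4):
        self.x = np.zeros(2)                # [delta_psi, offset]
        self.P = np.diag([1e-3, 1e-2])      # initial uncertainty
        self.Q = np.diag([q_dpsi, q_off])   # process noise per step
        self.R = r_vis                      # visual measurement noise

    def predict(self, psi_gyro, dt):
        # delta_psi grows by the bias-corrected gyro rate; offset is constant
        F = np.array([[1.0, -dt], [0.0, 1.0]])
        self.x = np.array([self.x[0] + (psi_gyro - self.x[1]) * dt, self.x[1]])
        self.P = F @ self.P @ F.T + self.Q

    def correct(self, dpsi_visu):
        # visual odometry observes the accumulated yaw-angle change directly
        H = np.array([[1.0, 0.0]])
        S = (H @ self.P @ H.T).item() + self.R
        K = (self.P @ H.T / S).ravel()
        self.x = self.x + K * (dpsi_visu - self.x[0])
        self.P = (np.eye(2) - np.outer(K, H)) @ self.P
        self.x[0] = 0.0                     # reset after each visual update
```

With the vehicle at rest, the gyro reads only its offset while the visual odometry reports no yaw change, so the offset state converges toward the true bias over repeated fusion cycles.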
[0035] The yaw rate ψ_gyro is ascertained from the measurement data from the yaw rate sensor 2, updated with an update period t_gyro of the yaw rate sensor 2 and supplied to the Kalman filter 5. The change in yaw angle Δψ_visu is ascertained from the measurement data from the at least one vehicle camera 3 and updated with an update period t_cam. Purely by way of illustration, the update period t_gyro of the yaw rate sensor is 10 ms and the update period t_cam of the vehicle camera is 33 ms, which results from the 30 frames/s frame rate of the at least one vehicle camera 3.
[0036] After the prediction step 14, a decision is made as to whether the correction step 7 should also be performed. This involves checking whether the change in yaw angle Δψ_visu has been updated since the last execution of the Kalman filter 5.
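The predict-always, correct-conditionally scheduling described above can be sketched with a callable-based interface; the interface itself is hypothetical.

```python
def fusion_step(kf_predict, kf_correct, psi_gyro, dt, dpsi_visu=None):
    """One filter cycle: the prediction step always runs; the correction
    step runs only when visual odometry has delivered an updated
    yaw-angle change since the last cycle. Returns True if the
    correction step was performed."""
    kf_predict(psi_gyro, dt)
    if dpsi_visu is not None:
        kf_correct(dpsi_visu)
        return True
    return False
```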
[0037] The yaw rate sensor 2 and the at least one vehicle camera 3 transfer their measurement data at different update rates. As already mentioned hereinabove, the update rate for the camera images is 30 frames/s, while the update period for the measurement data from the yaw rate sensor is 20 ms. The fusion period is 10 ms, for example. While the change in orientation between two successive updates of the fusion is needed for further processing, the visual odometry provides the change in orientation over 33 ms. Against this background, the state Δψ was introduced in the Kalman filter 5, said state representing the fused change in yaw angle Δψ_visu from the last update of the visual odometry to the present time. This state is set to zero after each update. At a new update of the visual odometry, this state contains the estimated change in yaw angle for 30 ms or for 40 ms. However, the visual odometry always provides the change in yaw angle Δψ_visu at an update interval of 33 ms. For this reason, the change in yaw angle Δψ_visu is e.g. extrapolated by 7 ms or reduced by 3 ms. The change in yaw angle Δψ_visu is then ready for fusion.
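The extrapolation by 7 ms (to 40 ms) or reduction by 3 ms (to 30 ms) can be realized, for instance, by linearly scaling the visual yaw-angle change from the camera interval to the required fusion interval. The linear-scaling rule assumes an approximately constant yaw rate over the interval and is an assumption of this sketch; the text only states that the value is extrapolated or reduced.

```python
def rescale_visual_yaw_change(dpsi_visu, t_cam=0.033, t_needed=0.030):
    """Linearly rescale the visual yaw-angle change from the camera's
    33 ms update interval to the fusion interval (e.g. 30 ms or 40 ms),
    assuming an approximately constant yaw rate over the interval."""
    return dpsi_visu * (t_needed / t_cam)
```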
[0038] The update of the yaw rate ψ_gyro and the change in yaw angle Δψ_visu is illustrated in the figures.
[0039] In addition, the following exemplary fusion cycles illustrate the interaction of the update times.
[0040] According to a first exemplary embodiment, a fusion cycle comprises fusion, at time A1, of an updated change in yaw angle Δψ_visu of the at least one vehicle camera 3, which was detected at time C1, and an updated yaw rate of the yaw rate sensor 2, which was detected at time B1.
[0041] According to a second exemplary embodiment, a fusion cycle comprises detecting an updated change in yaw angle Δψ_visu of the at least one vehicle camera 3 at time C2, no updated yaw rate ψ_gyro of the yaw rate sensor 2 having been detected since the last fusion cycle. In such a case, the last detected yaw rate ψ_gyro of the yaw rate sensor 2 is extrapolated to time B2 and finally fused with the change in yaw angle Δψ_visu detected at time C2.
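The extrapolation of the last detected yaw rate can be sketched, e.g., as a linear extrapolation from the two most recent gyro samples. The concrete extrapolation rule is an assumption of this sketch; a zero-order hold (simply reusing the last sample) would be a simpler alternative, and the text does not specify which variant is used.

```python
def extrapolate_yaw_rate(last_rate, prev_rate, dt_samples, dt_extra):
    """Linearly extrapolate the most recent yaw-rate sample forward by
    dt_extra seconds, using the slope between the two most recent
    samples spaced dt_samples seconds apart."""
    slope = (last_rate - prev_rate) / dt_samples
    return last_rate + slope * dt_extra
```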
[0042] According to a third exemplary embodiment, a fusion cycle comprises detecting an updated yaw rate ψ_gyro of the yaw rate sensor 2 at time B3, no updated change in yaw angle Δψ_visu of the vehicle camera 3 having been detected since the last fusion cycle. In such a case, no update of the fusion state is performed.
[0043] The embodiments make it possible to learn the offset of the yaw rate sensor 2 during vehicle movement by fusion of the yaw rate ψ_visu calculated and output by means of the visual odometry with the yaw rate ψ_gyro measured by the yaw rate sensor 2. The method described enables the vehicle orientation to be determined by fusion of the data from the yaw rate sensor 2 and the at least one vehicle camera 3.