Method For Determining an Orientation Angle of Inertial Sensors To One Another

20230228791 · 2023-07-20

    Abstract

    A method for determining the orientation of at least two inertial sensors relative to one another, in a device or between at least two devices each having at least one inertial sensor, includes a) receiving first raw acceleration data and/or rotation rate data of a first inertial sensor in three directions during regular operation of the device; b) simultaneously to step a), receiving second raw acceleration data and/or rotation rate data of a second inertial sensor in three directions during regular operation of the device; c) time-synchronizing the first and second raw acceleration data and/or rotation rate data so that time-synchronized raw acceleration data of the first inertial sensor and of the second inertial sensor are generated; and d) calculating relative orientation angles in three spatial directions between the first inertial sensor and the second inertial sensor with the time-synchronized raw acceleration data.

    Claims

    1. A method for determining the orientation, relative to one another, of at least two inertial sensors in a device, or of at least two devices that each have at least one inertial sensor, the method comprising: receiving first raw acceleration data and/or rotation rate data from a first inertial sensor in three directions during regular operation of the device; simultaneously to receiving the first raw acceleration data and/or rotation rate data, receiving second raw acceleration data and/or rotation rate data from a second inertial sensor in three directions during regular operation of the device; time-synchronizing the first raw acceleration data and/or rotation rate data of the first inertial sensor and the second raw acceleration data and/or rotation rate data of the second inertial sensor so that time-synchronized raw acceleration data and/or rotation rate data of the first inertial sensor and of the second inertial sensor are generated; and calculating relative orientation angles in three spatial directions between the first inertial sensor and the second inertial sensor with the time-synchronized raw acceleration data and/or rotation rate data.

    2. The method according to claim 1, wherein the regular operation of the device is the use of the finished device in a test operation performed individually for each device.

    3. The method according to claim 1, wherein the regular operation of the device is an operating phase of the device shortly after the regular initial start-up of the device by an end user.

    4. The method according to claim 1, wherein the regular operation of the device is any operating phase of the device during operation of the device in its intended use.

    5. The method according to claim 1, wherein, in the calculating of the relative orientation angles, the synchronized raw acceleration data are established in the form of a first vector {right arrow over (f)}.sub.ib,A.sup.b=(f.sub.x, f.sub.y, f.sub.z).sub.ib.sup.b for the first inertial sensor and a second vector {right arrow over (f)}.sub.ib,B.sup.b=(f.sub.x, f.sub.y, f.sub.z).sub.ib.sup.b for the second inertial sensor, and the relative orientation of the first and second inertial sensors is described by a matrix Ĉ.sub.b,B.sup.b,A according to the formula:
    {right arrow over (f)}.sub.ib,A.sup.b=Ĉ.sub.b,B.sup.b,A{right arrow over (f)}.sub.ib,B.sup.b

    6. The method according to claim 5, wherein an angle-dependent residual vector r(ϕ, θ, ψ) is calculated according to the following formula to estimate the matrix Ĉ.sub.b,B.sup.b,A:
    r(ϕ,θ,ψ)=Ĉ.sub.b,B.sup.b,A{right arrow over (f)}.sub.ib,B.sup.b−{right arrow over (f)}.sub.ib,A.sup.b, wherein a square error function is established based on the angle-dependent residual vector as: f(ϕ,θ,ψ)=½ r.sup.T(ϕ,θ,ψ) r(ϕ,θ,ψ), wherein a Jacobi matrix is established based on this square error function as: J=(∂f(ϕ,θ,ψ)/∂ϕ  ∂f(ϕ,θ,ψ)/∂θ  ∂f(ϕ,θ,ψ)/∂ψ), and wherein the orientation angles are estimated using the Jacobi matrix in an iterative estimation method.

    7. The method according to claim 1, wherein the calculation of the relative orientation angles includes using an iterative Gauss-Newton estimator (GN estimator) to calculate the relative orientation angles.

    8. The method according to claim 1, wherein the calculation of the relative orientation angles includes using an iterative Levenberg-Marquardt estimator (LM estimator) to calculate the relative orientation angles.

    9. The method according to claim 1, wherein the time synchronizing of the first and second raw acceleration data and/or rotation rate data takes place via a GNSS time parameter, which the first inertial sensor and the second inertial sensor respectively receive via an associated GNSS receiver.

    10. The method according to claim 1, wherein the time synchronizing of the first and second raw acceleration data and/or rotation rate data takes place with an auto-correlation function that processes the first raw acceleration data and the second raw acceleration data.

    11. The method according to claim 1, wherein: the receiving of the first raw acceleration data and/or rotation rate data includes receiving rotation rate data of the first inertial sensor in three rotation directions, and the receiving of the second raw acceleration data and/or rotation rate data includes receiving the rotation rate data of the second inertial sensor in three rotation directions.

    12. The method according to claim 1, further comprising: receiving third raw acceleration data from at least one third inertial sensor arranged in the device, wherein the time-synchronizing further includes synchronizing the third raw acceleration data with the first and second raw acceleration data and/or rotation rate data, and wherein the calculating of the relative orientation angles further includes calculating relative orientation angles in three spatial directions between the first inertial sensor and the third inertial sensor and between the second inertial sensor and the third inertial sensor based on the third raw acceleration data.

    13. A device comprising: a first inertial sensor; and a second inertial sensor, wherein the device is configured to receive first raw acceleration data and/or rotation rate data from the first inertial sensor in three directions during regular operation of the device; simultaneously to receiving the first raw acceleration data and/or rotation rate data, receive second raw acceleration data and/or rotation rate data from the second inertial sensor in three directions during regular operation of the device; time-synchronize the first raw acceleration data and/or rotation rate data of the first inertial sensor and the second raw acceleration data and/or rotation rate data of the second inertial sensor so that the time-synchronized raw acceleration data and/or rotation rate data of the first inertial sensor and of the second inertial sensor are generated; and calculate relative orientation angles in three spatial directions between the first inertial sensor and the second inertial sensor with the time-synchronized raw acceleration data and/or rotation rate data.
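
    Claims 5 to 8 recite estimating the matrix Ĉ.sub.b,B.sup.b,A from the time-synchronized samples by iteratively minimizing the residual r(ϕ, θ, ψ). Purely as an illustrative sketch (not part of the claims), the following Python code implements a Gauss-Newton estimator of the three orientation angles. The Z-Y-X Euler angle convention, the forward-difference numerical Jacobian, and all function names are assumptions; in addition, the Jacobian here is taken of the stacked residual vector (the usual Gauss-Newton formulation) rather than of the scalar error function f(ϕ, θ, ψ).

```python
import numpy as np

def rotation_matrix(phi, theta, psi):
    """Direction cosine matrix from Euler angles (Z-Y-X convention; an
    assumption, the disclosure does not fix a convention)."""
    cp, sp = np.cos(phi), np.sin(phi)
    ct, st = np.cos(theta), np.sin(theta)
    cs, ss = np.cos(psi), np.sin(psi)
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cp, -sp], [0.0, sp, cp]])
    Ry = np.array([[ct, 0.0, st], [0.0, 1.0, 0.0], [-st, 0.0, ct]])
    Rz = np.array([[cs, -ss, 0.0], [ss, cs, 0.0], [0.0, 0.0, 1.0]])
    return Rz @ Ry @ Rx

def estimate_orientation(f_A, f_B, iterations=50):
    """Gauss-Newton estimate of the angles (phi, theta, psi) minimizing
    the stacked residual r = C(phi, theta, psi) @ f_B - f_A over N
    time-synchronized samples; f_A and f_B have shape (N, 3)."""
    angles = np.zeros(3)
    eps = 1e-7  # forward-difference step for the numerical Jacobian
    for _ in range(iterations):
        C = rotation_matrix(*angles)
        r = (f_B @ C.T - f_A).ravel()
        J = np.empty((r.size, 3))
        for k in range(3):
            d = np.zeros(3)
            d[k] = eps
            Cd = rotation_matrix(*(angles + d))
            J[:, k] = ((f_B @ Cd.T - f_A).ravel() - r) / eps
        # Gauss-Newton step: solve J @ step = -r in the least-squares sense
        step = np.linalg.lstsq(J, -r, rcond=None)[0]
        angles += step
        if np.linalg.norm(step) < 1e-12:
            break
    return angles
```

    With noise-free, well-synchronized samples the iteration converges to the true angles within a few steps; for noisy sensor data, a Levenberg-Marquardt damping term (claim 8) makes the iteration more robust.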

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0059] The disclosure and the technical environment are explained in further detail below with reference to the figures. The figures show preferred exemplary embodiments to which the disclosure is however not limited. Shown are:

    [0060] FIG. 1: a flow chart of the described method;

    [0061] FIG. 2: a device for performing the described method; and

    [0062] FIGS. 3 and 4: schematically, experimental results which explain the ability to perform the described method.

    DETAILED DESCRIPTION

    [0063] In FIG. 1, the flow of the described method is shown schematically. The method steps a), b), c) and d), which are performed in succession, can be seen, wherein step b2) is optionally performed (in the case of further inertial sensors/more than two inertial sensors).

    [0064] FIG. 2 shows a described device 1 for performing the described method. The device 1 here has a total of three sensor modules 5, 6 and 8, namely a first sensor module 5, a second sensor module 6 and a further sensor module 8. The first sensor module 5 has a first inertial sensor 2. The second sensor module 6 has the second inertial sensor 3. The further sensor module 8 is representative of a flexible number of sensor modules, each of which likewise comprises further inertial sensors 7. According to this illustration, all sensor modules 5, 6 and 8 each have a GNSS receiver 4. However, this need not be the case. The GNSS receiver 4 is in particular not required if a time correlation of the raw data 12 determined with the inertial sensors 2, 3 and 7 takes place by means of auto-correlation techniques. The raw data 12 are each transferred to the orientation estimator 9, which determines the orientation angle vector 11 using the method described herein. The orientation angle vector 11 can subsequently be used in further data processing 10 in the device 1 to particularly precisely process the acceleration and rotation rate data determined with the inertial sensors 2, 3 and 7.
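
    The correlation-based time alignment of the raw data 12 mentioned above can be sketched as follows; this is a minimal illustrative example in Python, assuming two equally sampled one-axis acceleration traces that differ only by an integer sample delay, and locating that delay at the peak of the cross-correlation of the first raw data with the second raw data (the correlation processing referred to in claim 10). The function name and the integer-lag restriction are assumptions.

```python
import numpy as np

def estimate_lag(sig_a, sig_b):
    """Estimate by how many samples sig_b lags sig_a (positive result:
    sig_b is delayed relative to sig_a) from the peak of the
    cross-correlation of the two zero-mean traces."""
    a = sig_a - sig_a.mean()
    b = sig_b - sig_b.mean()
    corr = np.correlate(a, b, mode="full")
    # index len(b) - 1 of the full correlation corresponds to zero lag
    return (len(b) - 1) - int(np.argmax(corr))
```

    Once the lag is known, one of the two raw data streams is shifted accordingly before the orientation estimation, which produces the time-synchronized data required in step c).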

    [0065] In FIGS. 3 and 4, six raw data parameters are respectively compared, namely the accelerations of two inertial sensors IMU A and IMU B, in each case in the three spatial directions X, Y and Z.

    [0066] In FIG. 3, a total of 6 recordings can be seen: due to a manufacturing-related, very small deviation of the spatial directions of the sensors relative to one another of (ϕ, θ, ψ)=(−0.22059, 0.36478, −1.2127), respectively given in angular degrees (the largest deviation being the angle ψ at −1.2127°), the curves of the recordings of the two inertial sensors IMU A and IMU B for X, Y and Z are not exactly congruent.

    [0067] Although FIG. 4 also depicts 6 recordings, only 3 recordings can be seen because the recordings for X, Y and Z are respectively exactly congruent. This was achieved by correcting the X, Y and Z recordings from IMU A and IMU B with one another with the aid of the matrix Ĉ.sub.b,B.sup.b,A obtained using the method described herein.
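
    The correction described for FIG. 4 amounts to rotating every sample of IMU B into the body frame of IMU A with the estimated matrix Ĉ.sub.b,B.sup.b,A. A minimal sketch, assuming the estimated matrix is available as a 3×3 NumPy array (the function name is an assumption):

```python
import numpy as np

def align_recordings(f_B, C_hat):
    """Rotate every IMU B sample (rows of the (N, 3) array f_B) into the
    IMU A body frame using the estimated orientation matrix C_hat."""
    return f_B @ C_hat.T
```

    Applying this to the IMU B recordings of FIG. 3 yields the congruent X, Y and Z curves of FIG. 4.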