POSTURE ESTIMATION METHOD, POSTURE ESTIMATION DEVICE, AND VEHICLE
20230236220 · 2023-07-27
CPC classification: G01P21/00 (PHYSICS); G01P3/00 (PHYSICS)
Abstract
A posture estimation method includes calculating a posture change amount of an object based on an output of an angular velocity sensor, predicting posture information of the object by using the posture change amount, limiting a bias error component of an angular velocity around a reference vector in error information, and correcting the predicted posture information of the object based on the error information, the reference vector, and an output of a reference observation sensor.
Claims
1. A posture estimation method comprising: calculating a posture change amount of an electronic device with a processor based on an output from an angular velocity sensor and an output from an acceleration sensor of an inertial measurement unit included in the electronic device; predicting a posture quaternion of the electronic device with the processor by using the posture change amount; adjusting an error covariance matrix of the posture quaternion with the processor by increasing a motion velocity error component in the error covariance matrix and by reducing a covariance component between the motion velocity error component and an error component other than the motion velocity error component when the output of the angular velocity sensor or the output of the acceleration sensor is off-scale; and correcting the predicted posture quaternion of the electronic device with the processor based on the error covariance matrix of the posture quaternion.
2. The posture estimation method according to claim 1, wherein the error component other than the motion velocity error component includes a bias error component of the acceleration sensor.
3. The posture estimation method according to claim 1, wherein the error component other than the motion velocity error component includes a posture error component, a bias error component of an angular velocity, a bias error component of an acceleration, and an error component of a gravitational-acceleration correction value.
4. The posture estimation method according to claim 1, wherein the covariance component between the motion velocity error component and the error component other than the motion velocity error component is set to zero when the output of the angular velocity sensor or the output of the acceleration sensor is off-scale.
5. A posture estimation device comprising: a posture-change-amount calculation unit that calculates a posture change amount of an electronic device with a processor based on an output from an angular velocity sensor and an output from an acceleration sensor of an inertial measurement unit included in the electronic device; a posture information prediction unit that predicts a posture quaternion of the electronic device with the processor by using the posture change amount; an error information adjustment unit that adjusts an error covariance matrix of the posture quaternion with the processor by increasing a motion velocity error component in the error covariance matrix and by reducing a covariance component between the motion velocity error component and an error component other than the motion velocity error component when the output of the angular velocity sensor or the output of the acceleration sensor is off-scale; and a posture information correction unit that corrects the predicted posture quaternion of the electronic device with the processor based on the error covariance matrix of the posture quaternion.
6. A vehicle comprising: a posture estimation device having: a posture-change-amount calculation unit that calculates a posture change amount of an electronic device with a processor based on an output from an angular velocity sensor and an output from an acceleration sensor of an inertial measurement unit included in the electronic device; a posture information prediction unit that predicts a posture quaternion of the electronic device with the processor by using the posture change amount; an error information adjustment unit that adjusts an error covariance matrix of the posture quaternion with the processor by increasing a motion velocity error component in the error covariance matrix and by reducing a covariance component between the motion velocity error component and an error component other than the motion velocity error component when the output of the angular velocity sensor or the output of the acceleration sensor is off-scale; a posture information correction unit that corrects the predicted posture quaternion of the electronic device with the processor based on the error covariance matrix of the posture quaternion; and a control device that controls a posture of the vehicle based on a posture quaternion of the vehicle, which is estimated by the posture estimation device.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DESCRIPTION OF EXEMPLARY EMBODIMENTS
[0029] Hereinafter, a preferred embodiment according to the present disclosure will be described in detail with reference to the drawings. The embodiments described below do not unduly limit the contents of the present disclosure described in the appended claims. Not all components described below are essential components of the present disclosure.
1. Posture Estimation Method
1-1. Posture Estimation Theory
1-1-1. IMU Output Model
[0030] The output of the inertial measurement unit (IMU) includes angular velocity data d.sub.ω, k as an output of a three-axis angular velocity sensor and acceleration data d.sub.α, k as an output of a three-axis acceleration sensor at each sampling time point (t.sub.k). Here, as shown in Expression (1), the angular velocity data d.sub.ω, k is represented by the sum of the average value of the angular velocity vector ω in the sampling interval (Δt=t.sub.k−t.sub.k-1) and the residual bias b.sub.ω.
[0031] Similarly, as shown in Expression (2), the acceleration data d.sub.α, k is also represented by the sum of an average value of an acceleration vector α and the residual bias b.sub.α.
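The output model of Expressions (1) and (2) can be sketched as follows (illustrative Python only; the function and argument names are assumptions, not part of the disclosure). Each sensor channel is the interval-average true value plus a residual bias plus optional zero-mean white noise:

```python
import numpy as np

def imu_output(omega_avg, accel_avg, bias_omega, bias_accel,
               noise_std_omega=0.0, noise_std_accel=0.0, rng=None):
    """Expressions (1)-(2): sensor output = interval-average true value
    + residual bias (+ zero-mean white noise). Names are illustrative."""
    rng = rng if rng is not None else np.random.default_rng(0)
    d_omega = np.asarray(omega_avg, float) + np.asarray(bias_omega, float) \
        + rng.normal(0.0, noise_std_omega, 3)
    d_accel = np.asarray(accel_avg, float) + np.asarray(bias_accel, float) \
        + rng.normal(0.0, noise_std_accel, 3)
    return d_omega, d_accel
```

With the noise terms set to zero, the output reduces to average value plus residual bias, which is the deterministic part of Expressions (1) and (2).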
1-1-2. Calculation of Three-Dimensional Posture by Angular Velocity Integration
[0032] When a three-dimensional posture is represented by quaternions, a relation between the posture quaternion q and the angular velocity vector ω [rad/s] is represented by a differential equation in Expression (3).
[0033] Here, the symbol ⊗ (a circle and a cross superimposed on each other) indicates quaternion multiplication. For example, the elements of the quaternion product of q and p are calculated as shown in Expression (4).
[0034] As shown in Expression (5), the angular velocity vector ω is considered as being equivalent to a quaternion in which the real (scalar) component is zero, and the imaginary (vector) component coincides with the component of ω.
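Expressions (4) and (5) can be illustrated with a short sketch (illustrative Python; the (w, x, y, z) component ordering and the function names are assumptions):

```python
def quat_mul(q, p):
    """Quaternion product q ⊗ p per Expression (4); quaternions as (w, x, y, z)."""
    qw, qx, qy, qz = q
    pw, px, py, pz = p
    return (qw*pw - qx*px - qy*py - qz*pz,
            qw*px + qx*pw + qy*pz - qz*py,
            qw*py - qx*pz + qy*pw + qz*px,
            qw*pz + qx*py - qy*px + qz*pw)

def pure_quat(v):
    """Expression (5): embed a 3-vector as a quaternion with zero real part."""
    return (0.0, v[0], v[1], v[2])
```

Multiplying by the identity quaternion (1, 0, 0, 0) leaves any quaternion unchanged, and the basis relation i ⊗ j = k falls out of Expression (4) directly.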
[0035] Solving the differential equation (3) yields the three-dimensional posture. Unfortunately, however, no general closed-form solution is known. Further, the value of the angular velocity vector ω is obtained only in the form of discrete average values. Thus, approximation calculation must be performed with Expression (6) for each short sampling interval.
[0036] Expression (6) is calculated based on a Taylor expansion at t=t.sub.k-1 up to the third-order term in Δt, considering the posture quaternion q and the integration relation of the angular velocity vector ω for each axis. The term including ω.sub.k-1 in the expression corresponds to the coning correction term. The symbol × indicates a cross product (vector product) of three-dimensional vectors. For example, the elements of v×w are calculated as in Expression (7).
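The angular velocity integration step can be sketched as below (illustrative Python). Note the hedge: this sketch uses a common rotation-vector form with a two-sample coning term, which is not necessarily term-by-term identical to the patent's third-order Taylor form of Expression (6); the cross-product helper matches Expression (7):

```python
import math

def cross(v, w):
    """Cross product of 3-vectors, Expression (7)."""
    return (v[1]*w[2] - v[2]*w[1],
            v[2]*w[0] - v[0]*w[2],
            v[0]*w[1] - v[1]*w[0])

def delta_quat(omega_prev, omega_curr, dt):
    """Incremental posture quaternion over one sampling interval.
    Assumption: rotation vector with a two-sample coning term, a common
    approximation; Expression (6) in the patent may differ in detail."""
    c = cross(omega_prev, omega_curr)
    phi = [omega_curr[i]*dt + (dt*dt/12.0)*c[i] for i in range(3)]
    a = math.sqrt(phi[0]**2 + phi[1]**2 + phi[2]**2)
    if a < 1e-12:
        return (1.0, 0.0, 0.0, 0.0)  # no rotation in this interval
    s = math.sin(a/2.0) / a
    return (math.cos(a/2.0), phi[0]*s, phi[1]*s, phi[2]*s)
```

For constant rotation about one axis the coning term vanishes (the cross product of parallel vectors is zero), and the result reduces to the exact axis-angle quaternion.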
1-1-3. Tilt Error Observation by Gravitational Acceleration
[0037] The acceleration sensor detects the acceleration generated by its own movement. On the earth, however, the detected acceleration normally includes a gravitational acceleration of about 1 [G] (=9.80665 [m/s.sup.2]). Since the gravitational acceleration is normally a vector in the vertical direction, the error of the tilt (roll and pitch) component of the posture can be determined by comparison with the output of the three-axis acceleration sensor. Therefore, the acceleration vector α in the sensor coordinate system (xyz coordinate system), which is observed by the three-axis acceleration sensor, is first transformed to an acceleration vector α′ in the coordinate system (XYZ coordinate system) of the local space defined by horizontal orthogonal axes and a vertical axis. The coordinate (rotation) transformation can be calculated with the posture quaternion q and the conjugate quaternion q*, as shown in Expression (8).
α′=q⊗α⊗q* (8)
[0038] Expression (8) can be expressed with a three-dimensional coordinate transformation matrix C, as in Expression (9).
[0039] A tilt error is obtained by comparing the acceleration vector α′ to the gravitational acceleration vector g in the local-space coordinate system (XYZ coordinate system). The gravitational acceleration vector g is represented by Expression (10). In Expression (10), Δg indicates a gravitational-acceleration correction value indicating a difference [G] from the standard value of the gravitational acceleration vector g.
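The rotation of Expression (8) can be sketched as follows (illustrative Python; the quaternion component ordering and names are assumptions). The acceleration vector is embedded as a pure quaternion per Expression (5), multiplied on the left by q and on the right by the conjugate q*:

```python
def quat_mul(q, p):
    """Quaternion product per Expression (4); quaternions as (w, x, y, z)."""
    qw, qx, qy, qz = q
    pw, px, py, pz = p
    return (qw*pw - qx*px - qy*py - qz*pz,
            qw*px + qx*pw + qy*pz - qz*py,
            qw*py - qx*pz + qy*pw + qz*px,
            qw*pz + qx*py - qy*px + qz*pw)

def quat_conj(q):
    """Conjugate quaternion q*: negate the vector part."""
    return (q[0], -q[1], -q[2], -q[3])

def rotate_to_local(q, alpha):
    """Expression (8): alpha' = q ⊗ alpha ⊗ q*, alpha embedded as a pure quaternion."""
    p = quat_mul(quat_mul(q, (0.0, alpha[0], alpha[1], alpha[2])), quat_conj(q))
    return (p[1], p[2], p[3])
```

As a check, a quaternion representing a 90° rotation about the z-axis maps the sensor x-axis onto the local Y-axis.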
1-1-4. Observation of Zero Motion Velocity
[0040] In particular, the motion velocity of the IMU is considered to be substantially equal to zero in a long term, in user interface applications. A relation between the motion velocity vector v in the local-space coordinate system, and the acceleration vector α and the angular velocity vector ω in the sensor coordinate system is expressed with the coordinate transformation matrix C by a differential equation in Expression (11).
[0041] Here, the values of the acceleration vector α and the angular velocity vector ω are obtained only in a form of a discrete average value. Thus, the motion velocity vector is calculated by performing approximation calculation with Expression (12) for each short (sampling) time.
[0042] Expression (12) is calculated based on a Taylor expansion at t=t.sub.k-1 up to the third-order term in Δt, considering the motion velocity vector v and the integration relations of the acceleration vector α and the angular velocity vector ω for each axis. The residual error ε.sub.λ appears in the third-order term, which is ignored in Expression (12) because it is sufficiently small. The symbol · indicates a dot product (scalar product) of three-dimensional vectors. For example, v·w is calculated as in Expression (13).
v·w=v.sub.xw.sub.x+v.sub.yw.sub.y+v.sub.zw.sub.z (13)
1-1-5. Posture Quaternion and Error Thereof
[0043] It is considered that the calculated posture quaternion q has an error ε.sub.q with respect to the true value {circumflex over (q)}, as in Expression (14).
[0044] Here, Σ.sub.q.sup.2 represents the error covariance matrix indicating the magnitude of the error ε.sub.q. E[·] indicates the expected value. The superscript T indicates the transpose of a vector or matrix.
[0045] The quaternion and its error each have four values, but a three-dimensional posture (rotational transformation) has just three degrees of freedom. The fourth degree of freedom of the posture quaternion corresponds to an enlargement/reduction (scaling) transformation, whose ratio must normally be fixed to 1 in posture detection processing. In practice, the scaling component changes because of various calculation errors. Thus, processing that suppresses the change of the scaling ratio is required.
[0046] In a case of the posture quaternion q, the square of the absolute value thereof corresponds to the enlargement/reduction ratio. Thus, the change is suppressed by normalization processing in which the absolute value is set to 1, as in Expression (15).
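The normalization of Expression (15) is a one-line operation; a minimal sketch (illustrative Python, function name assumed):

```python
import math

def normalize_quat(q):
    """Expression (15): rescale q so that |q| = 1, suppressing scaling drift."""
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)
```

Applying this after every update keeps the enlargement/reduction ratio (the squared absolute value of q) pinned to 1 despite accumulated rounding errors.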
[0047] In the case of the posture error ε.sub.q, the rank (order) of the error covariance matrix Σ.sub.q.sup.2 must be held at three. Thus (assuming that the posture error angle is sufficiently small), the rank is limited as in Expression (16) by considering a (three-dimensional) error rotation vector ε.sub.θ in the local-space coordinate system.
1-1-6. Removal (Ignoring) of Azimuth Error
[0048] When an azimuth observation section such as a magnetic sensor is not provided in posture detection, the azimuthal component of the posture error merely increases monotonously and serves no purpose. Further, the increased error estimate causes the feedback gain to increase unnecessarily, which causes the azimuth to change or vary unexpectedly. Thus, the azimuthal component ε.sub.θ of the posture error is removed (ignored), as in Expression (17).
1-1-7. Extended Kalman Filter
[0049] An extended Kalman filter that calculates a three-dimensional posture based on the above model expressions can be designed.
State Vector and Error Covariance Matrix
[0050] As in Expression (18), the posture quaternion q, the motion velocity vector v, the residual bias b.sub.ω (offset to angular velocity vector ω) of the angular velocity sensor, the residual bias b.sub.α (offset to acceleration vector α) of the acceleration sensor, and the gravitational-acceleration correction value Δg, as unknown state values to be obtained, constitute a state vector x (14-dimensional vector) of the extended Kalman filter. In addition, the error covariance matrix Σ.sub.x.sup.2 is defined.
Process Model
[0051] In a process model, the value of the latest state vector is predicted based on the sampling interval Δt and values of the angular velocity data d.sub.ω and the acceleration data d.sub.α, as in Expression (19).
[0052] The covariance matrix of a state error is updated as in Expression (20) by receiving influences of noise components η.sub.ω and η.sub.α of the angular velocity data d.sub.ω, and the acceleration data d.sub.α, and process noise ρ.sub.ω, ρ.sub.α, and ρ.sub.g indicating instability of the residual bias b.sub.ω of the angular velocity sensor, the residual bias b.sub.α of the acceleration sensor, and the value (gravitational acceleration value) of the gravitational acceleration vector g.
[0053] Here, 0.sub.n×m indicates a zero matrix having n rows and m columns. I.sub.n×m indicates an identity matrix having n rows and m columns. J . . . indicates a matrix of propagation coefficients of each error obtained by partial differentiation of the process model, as in Expression (21). Σ . . . indicates a covariance matrix of each type of noise, as in Expression (22).
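The covariance update of Expression (20) has the generic extended-Kalman-filter form shown below (illustrative Python with NumPy; the specific block structure of the Jacobian J and the noise mapping Q is given by Expressions (21) and (22), which this sketch does not reproduce):

```python
import numpy as np

def propagate_covariance(Sigma, J, Q):
    """Time update of the state error covariance, the generic form behind
    Expression (20): Sigma <- J Sigma J^T + Q, where J is the error-propagation
    Jacobian of the process model and Q is the process noise mapped into the state."""
    return J @ Sigma @ J.T + Q
```

With J equal to the identity, the update reduces to adding the process noise, which is the expected limiting behavior when the state does not propagate error between steps.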
Observation Model
[0054] In an observation model, an observation residual Δz in which an observation residual Δz.sub.a of the gravitational acceleration based on acceleration data d.sub.α and an observation residual Δz.sub.v of the zero motion velocity are used as elements is calculated as in Expression (23).
[0055] Here, a Kalman coefficient K is calculated as in Expression (24), with the noise component η.sub.α of the acceleration data d.sub.α, the motion acceleration component ζ.sub.a, and the motion velocity component ζ.sub.v added as observation errors.
[0056] Here, J . . . is the matrix of propagation coefficients of each error obtained by partial differentiation of the observation model, as with Expression (25). Σ . . . indicates a covariance matrix of each type of noise as with Expression (26).
[0057] Here, μ is a coefficient for calculating the RMS of the error of each axis from the predicted motion acceleration. The state vector x is corrected, as in Expression (27), and the error covariance matrix Σ.sub.x.sup.2 thereof is updated, by using the Kalman coefficient K.
x.sub.k←x.sub.k−K.sub.kΔz.sub.k
Σ.sub.x,k.sup.2←Σ.sub.x,k.sup.2−K.sub.kH.sub.kΣ.sub.x,k.sup.2 (27)
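The gain and correction of Expressions (24) and (27) can be sketched generically as follows (illustrative Python with NumPy). Note the hedge on the sign: the sketch follows the patent's convention x ← x − KΔz, where Δz is the observation residual of Expression (23); many textbook formulations use a plus sign with the residual defined in the opposite sense:

```python
import numpy as np

def kalman_update(x, Sigma, dz, H, R):
    """Measurement update in the form of Expressions (24) and (27).
    H: observation Jacobian, R: observation-noise covariance.
    Sign convention follows the patent: x <- x - K dz."""
    S = H @ Sigma @ H.T + R                  # innovation covariance
    K = Sigma @ H.T @ np.linalg.inv(S)       # Kalman coefficient, Expression (24)
    x_new = x - K @ dz                       # state correction, Expression (27)
    Sigma_new = Sigma - K @ H @ Sigma        # covariance correction, Expression (27)
    return x_new, Sigma_new, K
```

In the scalar case with unit prior variance and unit observation noise, the gain is 0.5 and the posterior variance halves, which matches the usual Kalman intuition of splitting trust between prediction and observation.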
Posture Normalization Model
[0058] In the posture normalization model, the update of Expression (28) is performed in order to maintain the posture quaternion and its error covariance at proper values.
[0059] Here, D′.sup.TD′ indicates a matrix of Expression (29) for limiting the rank of the posture error and removing the azimuthal component.
1-1-8. Initial Value
State Vector and Error Covariance Matrix
[0060] Initial values of the state vector x and the error covariance matrix Σ.sub.x.sup.2 are given as in Expression (30)
Posture Quaternion
[0061] It is necessary that the posture of the IMU is given in quaternion expression. The posture quaternion q can be calculated from a roll (bank) angle φ [rad], a pitch (elevation) angle θ [rad], and a yaw angle (azimuth) ψ [rad] used for the posture and the like of an aircraft, as in Expression (31).
[0062] The error covariance matrix Σ.sub.q.sup.2 is calculated from RMSσ.sub.φ [rad RMS] of a roll angle error and RMSσ.sub.θ [rad RMS] of a pitch angle error, as in Expression (32) (yaw angle error is ignored).
Motion Velocity Vector
[0063] If the initial state is stationary, the motion velocity vector v may be set to 0. Regardless of whether the state is stationary, the error covariance matrix Σ.sub.v.sup.2 is given based on RMSσ.sub.vx, RMSσ.sub.vy, and RMSσ.sub.vz [Gs RMS] of the errors of the motion velocity vector v in the respective axes, as in Expression (33).
Residual Biases of Angular Velocity/Acceleration Sensor
[0064] If the residual bias b.sub.ω of the angular velocity sensor and the residual bias b.sub.α of the acceleration sensor are known, the residual biases should be set appropriately. When the residual biases are unknown, zero is given as their expected value. The error covariance matrices Σ.sub.bω.sup.2 and Σ.sub.bα.sup.2 are given based on the errors RMSσ.sub.bωx, RMSσ.sub.bωy, and RMSσ.sub.bωz [rad/s RMS] of the residual biases of the angular velocity sensor in the respective axes and the errors RMSσ.sub.bαx, RMSσ.sub.bαy, and RMSσ.sub.bαz [G RMS] of the residual biases of the acceleration sensor in the respective axes, as in Expression (34).
Gravitational-Acceleration Correction Value
[0065] If the gravitational acceleration value is known, a difference from the standard value 1 [G] (=9.80665 [m/s.sup.2]) is required to be appropriately set. When the gravitational acceleration value is unknown, zero as an expected value is given to the gravitational acceleration value. RMSσ.sub.Δg [G RMS] of the error of the gravitational acceleration value is applied to the error covariance matrix.
1-1-9. Setting Value
Sampling Interval
[0066] Since the posture detection processing from the output of the IMU corresponds in principle to a time integration operation, the sampling interval Δt [s] is an important value and must be set appropriately.
Output Noise of Angular Velocity/Acceleration Sensor
[0067] The noise components η included in the outputs of the angular velocity sensor and the acceleration sensor are considered to be zero-mean white Gaussian noise of variance σ.sub.η.sup.2, independent for each axis. The magnitude of the noise components is designated by the corresponding RMS values (σ.sub.ηωx, σ.sub.ηωy, σ.sub.ηωz) [rad/s RMS] and (σ.sub.ηαx, σ.sub.ηαy, σ.sub.ηαz) [G RMS], as in Expression (35).
Biases of Angular Velocity/Acceleration Sensor and Instability of Gravitational Acceleration Value
[0068] It is considered that the biases of the angular velocity sensor and the acceleration sensor are not constant and change with time. It is considered that the gravitational acceleration value also slightly changes depending on the surrounding environment. Considering the change as an individual random walk, the instability is designated by (σ.sub.ρωx, σ.sub.ρωy, σ.sub.ρωz) [rad/s/√s], (σ.sub.ραx, σ.sub.ραy, σ.sub.ραz) [G/√s], and σ.sub.ρg[G/√s], as in Expression (36).
Motion Acceleration
[0069] When the gravitational acceleration is observed in order to correct the posture, the motion acceleration component ζ.sub.a acts as an observation error. If this observation error is modeled as simple white Gaussian noise, the posture responds too sensitively to the large motion acceleration caused by rapid motion. Thus, a noise model as in Expression (37), which changes the noise level depending on the magnitude of the estimated motion acceleration (the difference between the observed acceleration and the estimated gravitational acceleration), is used, with a linear coefficient μ [N/A] and a constant term (σ.sub.ζαx, σ.sub.ζαy, σ.sub.ζαz) [G RMS] as setting items.
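The idea behind Expression (37) can be sketched per axis as below (illustrative Python; whether the linear term and the constant term combine by simple addition or by root-sum-square is an assumption here, since the patent only states the Expression (37) form):

```python
def accel_obs_noise_std(est_motion_accel_G, mu, sigma_const):
    """Sketch of the Expression (37) idea: the observation-noise RMS for the
    gravity observation grows with the estimated motion acceleration
    (|observed acceleration - estimated gravity|), so rapid motion is
    trusted less. Combination by simple addition is an assumption."""
    return mu * abs(est_motion_accel_G) + sigma_const
```

At rest the noise level reduces to the constant term, so the gravity observation is fully trusted; under strong motion acceleration the level rises and the tilt correction is automatically de-weighted.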
Motion Velocity
[0070] In the zero-motion-velocity observation, the motion velocity of the IMU is taken to be substantially zero in the long term. The motion velocity component ζ.sub.v appearing in the short term acts as an observation error. This observation error is considered to be white Gaussian noise independent for each axis, and its magnitude is designated by the RMS values (σ.sub.ζvx, σ.sub.ζvy, σ.sub.ζvz) [Gs RMS], as in Expression (38).
1-1-10. Limitation of Yaw-Axis Component of Bias Error in Angular Velocity Sensor
[0071] When an azimuth observation section such as a magnetic sensor is not provided, the yaw-axis (vertical) component of the bias error of the angular velocity sensor also monotonously increases. Normally, the yaw-axis direction changes with the posture relative to the sensor coordinates. For example, even if the z-axis of the angular velocity sensor coincides with the yaw axis at a certain time point and the error estimate therefore increases, correction is performed by observing the gravitational acceleration once the posture changes so that the z-axis tilts away from the yaw axis (for example, toward a horizontal-axis direction), and the error estimate is then reduced. However, when the posture changes little and substantially the same posture continues for a long time, the error estimate of the yaw-axis component increases and the feedback gain increases. The increased feedback gain causes the bias estimation value of the angular velocity sensor to change unintentionally, and thus causes the azimuth of the posture to drift. In practice, it is unlikely that the bias estimation error of even the yaw-axis component of a practical angular velocity sensor increases without limit beyond, for example, the initial bias error. Accordingly, an upper limit value is provided only for the yaw-axis component of the bias estimation error of the angular velocity sensor in the state error covariance matrix, and the yaw-axis component is limited so as not to exceed the upper limit value.
[0072] Firstly, as in Expression (39), the variance σ.sub.bωv.sup.2 of the yaw-axis component is calculated from the error covariance matrix Σ.sub.bω.sup.2 of the residual bias in the angular velocity sensor, based on the current posture quaternion.
[0073] When the variance σ.sub.bωv.sup.2 of the yaw-axis component exceeds the upper limit value σ.sub.bωmax.sup.2, the yaw-axis component is limited as in Expression (40).
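The two steps of Expressions (39) and (40) can be sketched as follows (illustrative Python with NumPy). The projection in Expression (39) is a quadratic form with the vertical unit vector expressed in sensor coordinates (derived from the current posture quaternion); the clipping scheme below, which subtracts the excess variance along that direction, is an assumption standing in for the exact Expression (40):

```python
import numpy as np

def limit_yaw_bias_variance(Sigma_bw, vertical_unit, var_max):
    """Sketch of Expressions (39)-(40): project the gyro-bias covariance onto
    the vertical (yaw-axis) direction and, if the projected variance exceeds
    the ceiling, remove the excess along that direction only.
    The clipping scheme here is an assumption."""
    u = np.asarray(vertical_unit, dtype=float)
    u = u / np.linalg.norm(u)
    var_v = float(u @ Sigma_bw @ u)          # Expression (39): sigma_bwv^2 = u^T Sigma u
    if var_v <= var_max:
        return Sigma_bw                      # within the limit: no change
    excess = var_v - var_max
    # Subtract the excess variance along u; orthogonal components are untouched.
    return Sigma_bw - excess * np.outer(u, u)
```

After clipping, the projected variance along the vertical direction equals the upper limit, while the horizontal bias-error components, which are regularly corrected by the gravity observation, are left intact.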
1-2. Flowchart of Posture Estimation Method
[0074]
[0075] As illustrated in
[0076] For example, the posture estimation device may set the initial posture of the inertial measurement unit (IMU) to have a roll angle, a pitch angle, and a yaw angle which have been determined in advance, and may set q.sub.0 by substituting the roll angle, the pitch angle, and the yaw angle into Expression (31). Alternatively, the posture estimation device may acquire acceleration data from the acceleration sensor in a state where the inertial measurement unit (IMU) is stopped. The posture estimation device may specify the direction of the gravitational acceleration from the acceleration data, calculate a roll angle and a pitch angle, and set the yaw angle to a predetermined value (for example, 0). Then, the posture estimation device may set q.sub.0 by substituting the roll angle, the pitch angle, and the yaw angle into Expression (31). The posture estimation device sets Σ.sub.q, 0.sup.2 by substituting RMSσ.sub.φ of the roll angle error and RMSσ.sub.θ of the pitch angle error into Expression (32).
[0077] For example, the posture estimation device sets a state where the inertial measurement unit (IMU) is stopped to be an initial state and sets v.sub.0 to 0. Then, the posture estimation device sets Σ.sub.v, 0.sup.2 by substituting RMSσ.sub.vx, RMSσ.sub.vy, and RMSσ.sub.vz of the error of the motion velocity vector v in the axes into Expression (33).
[0078] If the residual bias b.sub.ω of the angular velocity sensor and the residual bias b.sub.α of the acceleration sensor are known, the posture estimation device sets the values of the residual biases to b.sub.ω, 0 and b.sub.α, 0. If b.sub.ω and b.sub.α are not known, the posture estimation device sets b.sub.ω, 0 and b.sub.α, 0 to zero. The posture estimation device sets Σ.sub.bω, 0.sup.2 and Σ.sub.bα, 0.sup.2 by substituting the errors RMSσ.sub.bωx, RMSσ.sub.bωy, and RMSσ.sub.bωz of the residual bias of the angular velocity sensor in the axes and the errors RMSσ.sub.bαx, RMSσ.sub.bαy, and RMSσ.sub.bαz of the residual bias of the acceleration sensor in the axes into Expression (34).
[0079] If the gravitational acceleration value is known, the posture estimation device sets the difference from 1 [G] to Δg.sub.0. If the gravitational acceleration value is not known, the posture estimation device sets Δg.sub.0 to zero. The posture estimation device sets RMSσ.sub.Δg of the error of the gravitational acceleration value to σ.sub.Δg, 0.sup.2.
[0080] Then, the posture estimation device acquires a measured value of the inertial measurement unit (IMU) (measured-value acquisition step S2). Specifically, the posture estimation device waits until the sampling interval Δt elapses. If the sampling interval Δt elapses, the posture estimation device sets k=k+1 and t.sub.k=t.sub.k-1+Δt and acquires angular velocity data d.sub.ω, k and acceleration data d.sub.α, k from the inertial measurement unit (IMU).
[0081] Then, the posture estimation device performs prediction processing (also referred to as time update processing) of the state vector x.sub.k (including the posture quaternion q.sub.k being the posture information at a time point t.sub.k, as an element) and the error covariance matrix Σ.sub.x, k.sup.2 being error information at the time point t.sub.k (prediction step S3).
[0082]
[0083] Then, the posture estimation device performs processing (posture-change-amount calculation processing) of calculating the posture change amount Δq.sub.k at the time point t.sub.k based on the output of the angular velocity sensor (posture-change-amount calculation step S32). Specifically, the posture estimation device calculates the posture change amount Δq.sub.k with Expression (6), based on the angular velocity data d.sub.ω, k.
[0084] The posture estimation device performs processing (velocity-change-amount calculation processing) of calculating the velocity change amount (C.sub.kλ.sub.k−g.sub.k-1)Δt of the object based on the output of the acceleration sensor and the output of the angular velocity sensor (velocity-change-amount calculation step S33). Specifically, the posture estimation device calculates the velocity change amount (C.sub.kλ.sub.k−g.sub.k-1)Δt with Expressions (9), (10), and (12), based on the acceleration data d.sub.α, k and the angular velocity data d.sub.ω, k.
[0085] The posture estimation device performs processing (posture information prediction processing) of predicting the posture quaternion q.sub.k being posture information of the object at the time point t.sub.k, by using the posture change amount Δq.sub.k (posture information prediction step S34). In the posture information prediction step S34, the posture estimation device further performs processing (velocity information prediction processing) of predicting the motion velocity vector v.sub.k being velocity information of the object at the time point t.sub.k by using the velocity change amount (C.sub.kλ.sub.k−g.sub.k-1)Δt. Specifically, in the posture information prediction step S34, the posture estimation device performs processing of predicting the posture quaternion q.sub.k and the state vector x.sub.k including the motion velocity vector v.sub.k as the element, by Expressions (6), (12), and (19).
[0086] Lastly, the posture estimation device performs processing of updating the error covariance matrix Σ.sub.x, k.sup.2 at the time point t.sub.k by Expressions (20) and (21) (error information update step S35).
[0087] Returning to
[0088] The posture estimation device performs processing (bias error limitation processing) of limiting the bias error component of the angular velocity around the reference vector, in the error covariance matrix Σ.sub.x, k.sup.2 being the error information (bias error limitation step S5). As described above, in the embodiment, the reference vector is the gravitational acceleration vector. Thus, in Step S5, the posture estimation device limits the bias error component of the angular velocity around the gravitational acceleration vector, that is, limits the vertical component (yaw-axis component) of the bias error of the angular velocity.
[0089]
[0090] Then, the posture estimation device performs processing of determining whether or not the variance σ.sub.bωv.sup.2 exceeds the upper limit value (bias error determination step S52).
[0091] When the variance σ.sub.bωv.sup.2 exceeds the upper limit value (Y in Step S52), the posture estimation device performs limitation operation processing of limiting the vertical component of the bias error in the angular velocity (limitation operation step S53). Specifically, when the variance σ.sub.bωv.sup.2 exceeds the upper limit value σ.sub.bωmax.sup.2, the posture estimation device updates the error covariance matrix Σ.sub.x, k.sup.2 by Expression (40). When the variance σ.sub.bωv.sup.2 is equal to or less than the upper limit value (N in Step S52), the posture estimation device does not perform the limitation operation processing in Step S53.
[0092] Returning to
[0093] As illustrated in
[0094] Then, the posture estimation device performs processing (posture information correction processing) of correcting the posture quaternion q.sub.k being the predicted posture information of the object at the time point t.sub.k (posture information correction step S62). Specifically, the posture estimation device performs processing of correcting the state vector x.sub.k at the time point t.sub.k with Expression (27), the observation residual Δz.sub.k, and the Kalman coefficient K.sub.k.
[0095] The posture estimation device performs processing of normalizing the state vector x.sub.k at the time point t.sub.k by Expression (28) (normalization step S63).
[0096] Then, the posture estimation device performs processing of correcting the error covariance matrix Σ.sub.x, k.sup.2 at the time point t.sub.k with Expression (27), the Kalman coefficient K.sub.k, and the transformation matrix H.sub.k (error information correction step S64).
[0097] Returning to
[0098] The order of the steps in
[0099] As described above, according to the posture estimation method in the embodiment, the posture change amount and the velocity change amount of an object are calculated with Expressions (6) and (12) derived from Expressions (1) and (2) of the output model of the IMU. Then, the posture of the object is estimated with the posture change amount and the velocity change amount. In Expressions (6) and (12), a calculation error in the posture change amount or the velocity change amount is reduced in comparison to that in the related art, by calculating the posture change amount and the velocity change amount with not only the first-order term of Δt but also the second-order term and the third-order term thereof.
[0100] If the object rotates, the coordinate transformation matrix C.sub.k changes. However, the coordinate transformation matrix C.sub.k is calculated from the elements of the posture quaternion q.sub.k estimated by the Kalman filter. Thus, when the object rotates rapidly, the coordinate transformation matrix C.sub.k may not immediately follow the rotation of the object. In this regard, according to the posture estimation method in the embodiment, since the acceleration λ.sub.k is calculated with not only the acceleration but also the angular velocity in Expression (12), the rotation of the object is immediately reflected in the acceleration λ.sub.k. Thus, even when the object rotates rapidly, it is possible to reduce the deterioration of the calculation accuracy of the velocity change amount.
[0101] Further, according to the posture estimation method in the embodiment, with Expression (23), the observation residual of the gravitational acceleration based on the output of the acceleration sensor and the motion velocity of the object become zero in the long term, and the Kalman coefficient K.sub.k in Expression (24) is calculated with the observation residual of the zero motion velocity. Thus, even though observation information of the azimuth is not provided, it is possible to estimate the posture of the object with high accuracy.
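As a sketch, the long-term observation model above can be written with a gravity residual and a zero-velocity residual. The stacking order, sign conventions, and frames are assumptions for illustration; the exact form of Expression (23) is not reproduced in this excerpt.

```python
import numpy as np

def observation_residual(accel_mean, C_k, g_k, v_k):
    # Gravity residual: the accelerometer output (sensor frame, bias
    # removed) is compared with gravity rotated into the sensor frame.
    dz_a = accel_mean - C_k.T @ g_k
    # Zero-velocity residual: the motion velocity is observed as zero
    # in the long term.
    dz_v = -v_k
    return np.concatenate([dz_a, dz_v])
```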
[0102] Since information regarding the azimuth of the object is not included in the output of the angular velocity sensor or the output of the acceleration sensor, the azimuth error of the object is not corrected. However, if the azimuth error included in the updated posture error remains, the reliability of the azimuth error monotonically decreases, and the posture error monotonically increases. Thus, the estimation accuracy of the posture may deteriorate. Regarding this, according to the posture estimation method in the embodiment, the azimuth error included in the posture error of the object, which has been updated by using the output of the angular velocity sensor 12 and the output of the acceleration sensor 14, is removed by Expression (17). Thus, it is possible to reduce the concern of the posture error monotonically increasing and the concern of the estimation accuracy of the posture decreasing.
[0103] The output of the angular velocity sensor and the output of the acceleration sensor do not include information regarding the azimuth of the object. Thus, for example, when the object continues in substantially the same posture for a long term, the vertical component of the updated bias error of the angular velocity sensor monotonically increases, and the Kalman coefficient K.sub.k becomes too large. Consequently, there is a concern of deteriorating the estimation accuracy of the posture. Regarding this, according to the posture estimation method in the embodiment, when the vertical component of the bias error in the angular velocity sensor exceeds the upper limit value, the vertical component of the bias error is limited to the upper limit value by Expression (40). Thus, it is possible to reduce the concern that the Kalman coefficient K.sub.k becomes too large and the concern of the estimation accuracy of the posture decreasing.
[0104] As described above, according to the posture estimation method in the embodiment, it is possible to reduce the concern of decreasing the estimation accuracy of the posture of the object even when the posture of the object changes only slightly, and to estimate the posture of the object with sufficient accuracy.
2. Posture Estimation Device
2-1. Configuration of Posture Estimation Device
[0105]
[0106] As illustrated in
[0107] In the embodiment, the IMU 10 includes the angular velocity sensor 12, the acceleration sensor 14, and a signal processing unit 16. In the embodiment, the IMU 10 may have a configuration obtained by changing or removing some components or by adding another component.
[0108] The angular velocity sensor 12 measures an angular velocity in each of the directions of three axes that intersect with each other and, ideally, are perpendicular to each other. The angular velocity sensor 12 outputs an analog signal depending on the magnitude and the orientation of the measured three-axis angular velocity.
[0109] The acceleration sensor 14 measures an acceleration in each of the directions of three axes that intersect with each other and, ideally, are perpendicular to each other. The acceleration sensor 14 outputs an analog signal depending on the magnitude and the orientation of the measured three-axis acceleration.
[0110] The signal processing unit 16 performs processing of sampling the output signal of the angular velocity sensor 12 at a predetermined sampling interval Δt and converting it into angular velocity data having a digital value. The signal processing unit 16 also performs processing of sampling the output signal of the acceleration sensor 14 at the predetermined sampling interval Δt and converting it into acceleration data having a digital value.
[0111] Ideally, the angular velocity sensor 12 and the acceleration sensor 14 are attached to the IMU 10 such that their three axes coincide with the three axes (x-axis, y-axis, and z-axis) of the sensor coordinate system, which is an orthogonal coordinate system defined for the IMU 10. However, in practice, an error occurs in the mounting angle. Thus, the signal processing unit 16 performs processing of converting the angular velocity data and the acceleration data into data in the xyz coordinate system, by using a correction parameter which has been calculated in advance in accordance with the error in the mounting angle. The signal processing unit 16 also performs temperature correction processing on the angular velocity data and the acceleration data in accordance with the temperature characteristics of the angular velocity sensor 12 and the acceleration sensor 14.
[0112] A function of A/D conversion or temperature correction may be embedded in the angular velocity sensor 12 and the acceleration sensor 14.
[0113] The IMU 10 outputs angular velocity data d.sub.ω and acceleration data d.sub.α after the processing by the signal processing unit 16 to the processing unit 20 of the posture estimation device 1.
[0114] The ROM 30 stores programs used when the processing unit 20 performs various types of processing, and various programs or various types of data for realizing application functions, for example.
[0115] The RAM 40 is a storage unit that is used as a work area of the processing unit 20, and temporarily stores a program or data read out from the ROM 30 or operation results obtained by the processing unit 20 performing processing in accordance with various programs, for example.
[0116] The recording medium 50 is a non-volatile storage unit that stores data required to be preserved for a long term among pieces of data generated by processing of the processing unit 20. The recording medium 50 may store programs used when the processing unit 20 performs various types of processing, and various programs or various types of data for realizing application functions, for example.
[0117] The processing unit 20 performs various types of processing in accordance with the program stored in the ROM 30 or the recording medium 50 or in accordance with the program which is received from a server via a network and then is stored in the RAM 40 or the recording medium 50. In particular, in the embodiment, the processing unit 20 executes the program to function as a bias removal unit 22, a posture-change-amount calculation unit 24, a velocity-change-amount calculation unit 26, and a posture estimation unit 28. Thus, the processing unit 20 performs a predetermined operation on the angular velocity data d.sub.ω and the acceleration data d.sub.α output at an interval of Δt by the IMU 10 so as to perform processing of estimating the posture of the object.
[0118] In the embodiment, as illustrated in
[0119] The bias removal unit 22 performs processing of calculating the three-axis angular velocity obtained by removing a bias error from the output of the angular velocity sensor 12 and performs processing of calculating the three-axis acceleration obtained by removing a bias error from the output of the acceleration sensor 14.
[0120] The posture-change-amount calculation unit 24 calculates the posture change amount of the object based on the output of the angular velocity sensor 12. Specifically, the posture-change-amount calculation unit 24 performs processing of calculating the posture change amount of the object by approximation with a polynomial expression in which the sampling interval Δt is used as a variable. The posture-change-amount calculation unit 24 performs the processing with the three-axis angular velocity in which the bias error has been removed by the bias removal unit 22.
[0121] The velocity-change-amount calculation unit 26 calculates the velocity change amount of the object based on the output of the acceleration sensor 14 and the output of the angular velocity sensor 12. Specifically, the velocity-change-amount calculation unit 26 performs processing of calculating the velocity change amount of the object with the three-axis angular velocity and the three-axis acceleration in which the bias error has been removed by the bias removal unit 22.
[0122] The posture estimation unit 28 functions as an integration calculation unit 101, a posture information prediction unit 102, an error information update unit 103, a correction coefficient calculation unit 104, a posture information correction unit 105, a normalization unit 106, an error information correction unit 107, a rotational error-component removal unit 108, and a bias error limitation unit 109. The posture estimation unit 28 performs processing of estimating the posture of the object with the posture change amount calculated by the posture-change-amount calculation unit 24 and the velocity change amount calculated by the velocity-change-amount calculation unit 26. In practice, the posture estimation unit 28 performs processing of estimating a state vector x defined in Expression (18) and an error covariance matrix Σ.sub.x.sup.2 thereof with an extended Kalman filter.
[0123] The integration calculation unit 101 performs integration processing of integrating the posture change amount calculated by the posture-change-amount calculation unit 24 with the previous estimated value of the posture, which has been corrected by the posture information correction unit 105 and normalized by the normalization unit 106. The integration calculation unit 101 performs integration processing of integrating the velocity change amount calculated by the velocity-change-amount calculation unit 26 with the previous estimated value of the velocity, which has been corrected by the posture information correction unit 105 and normalized by the normalization unit 106.
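The integration of the posture change amount is a quaternion product, and the velocity integration is a vector addition. A minimal sketch follows, in a scalar-first (w, x, y, z) convention; the document does not state its quaternion convention, so this ordering is an assumption.

```python
import numpy as np

def quat_mul(q, r):
    # Hamilton product q ⊗ r, scalar-first (w, x, y, z).
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

# Integration of posture and velocity, as performed by the
# integration calculation unit:
#   q_k = q_{k-1} ⊗ Δq_k
#   v_k = v_{k-1} + Δv_k
```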
[0124] The posture information prediction unit 102 performs processing of predicting posture quaternion q as posture information of the object, with the posture change amount calculated by the posture-change-amount calculation unit 24. The posture information prediction unit 102 also performs processing of predicting a motion velocity vector v as velocity information of the object, based on the velocity change amount calculated by the velocity-change-amount calculation unit 26. In practice, the posture information prediction unit 102 performs processing of predicting the state vector x including the posture quaternion q and the motion velocity vector v as elements.
[0125] The error information update unit 103 performs processing of updating the error covariance matrix Σ.sub.x.sup.2 as error information, based on the output of the angular velocity sensor 12. Specifically, the error information update unit 103 performs processing of updating a posture error of the object with the three-axis angular velocity in which the bias error has been removed by the bias removal unit 22. In practice, the error information update unit 103 performs processing of updating the error covariance matrix Σ.sub.x.sup.2 with the extended Kalman filter.
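In generic extended-Kalman-filter form, the time update of the error covariance reads Σ ← F Σ Fᵀ + Q. Expressions (20) and (21) are assumed to be of this shape, with the state-transition Jacobian F built from the posture change amount and Q the process-noise covariance; neither matrix is reproduced in this excerpt.

```python
import numpy as np

def update_error_covariance(sigma_prev, F, Q):
    # Generic EKF time update of the error covariance matrix.
    # F: state-transition Jacobian, Q: process-noise covariance
    # (both assumed for illustration).
    return F @ sigma_prev @ F.T + Q
```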
[0126] The rotational error-component removal unit 108 performs processing of removing a rotational error component around a reference vector, in the error covariance matrix Σ.sub.x.sup.2 being the error information. Specifically, the rotational error-component removal unit 108 performs processing of removing an azimuth error component included in the posture error in the error covariance matrix Σ.sub.x.sup.2 updated by the error information update unit 103. In practice, the rotational error-component removal unit 108 performs processing of generating the error covariance matrix Σ.sub.x.sup.2 in which the rank limitation and removal of the azimuth error component are performed in the error covariance matrix Σ.sub.q.sup.2 of the posture, on the error covariance matrix Σ.sub.x.sup.2.
[0127] The bias error limitation unit 109 performs processing of limiting a bias error component of an angular velocity around the reference vector, in the error covariance matrix Σ.sub.x.sup.2 being the error information. Specifically, the bias error limitation unit 109 performs processing of limiting a vertical component of the bias error of the angular velocity, in the error covariance matrix Σ.sub.x.sup.2 generated by the rotational error-component removal unit 108. In practice, the bias error limitation unit 109 performs processing as follows. That is, the bias error limitation unit 109 determines whether or not the vertical component of the bias error of the angular velocity exceeds an upper limit value. When the vertical component exceeds the upper limit value, the bias error limitation unit 109 generates the error covariance matrix Σ.sub.x.sup.2 in which limitation is applied such that the vertical component has the upper limit value.
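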
[0128] The correction coefficient calculation unit 104 performs processing of calculating correction coefficients based on the error covariance matrix Σ.sub.x.sup.2 which has been generated by the bias error limitation unit 109 and is the error information. The correction coefficients are used for determining the correction amount of the posture information (posture quaternion q) or the velocity information (motion velocity vector v) of the object by the posture information correction unit 105 and the correction amount of the error information (error covariance matrix Σ.sub.x.sup.2) by the error information correction unit 107. In practice, the correction coefficient calculation unit 104 performs processing of calculating an observation residual Δz, a Kalman coefficient K, and a transformation matrix H.
[0129] The posture information correction unit 105 performs processing of correcting the posture information (posture quaternion q) of the object, which has been predicted by the posture information prediction unit 102, based on the error covariance matrix Σ.sub.x being the error information, the gravitational acceleration vector g being the reference vector, and the output of the acceleration sensor 14 being the reference observation sensor. Specifically, the posture information correction unit 105 performs processing of correcting the posture quaternion q with the error covariance matrix Σ.sub.x generated by the bias error limitation unit 109, and the Kalman coefficient K and the observation residual Δz.sub.a of the gravitational acceleration calculated by the correction coefficient calculation unit 104 based on the gravitational acceleration vector g and the acceleration vector α (obtained based on the output of the acceleration sensor 14). In practice, the posture information correction unit 105 performs processing of correcting the state vector x predicted by the posture information prediction unit 102, with the extended Kalman filter.
[0130] The normalization unit 106 performs processing of normalizing the posture information (posture quaternion q) of the object, which has been corrected by the posture information correction unit 105, such that the magnitude of the posture quaternion q remains 1. In practice, the normalization unit 106 performs processing of normalizing the state vector x corrected by the posture information correction unit 105.
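Normalization after the additive Kalman correction is simply division by the quaternion norm:

```python
import numpy as np

def normalize_quaternion(q):
    # Restore unit magnitude after the additive Kalman correction.
    return q / np.linalg.norm(q)
```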
[0131] The error information correction unit 107 performs processing of correcting the error covariance matrix Σ.sub.x being the error information. Specifically, the error information correction unit 107 performs processing of correcting the error covariance matrix Σ.sub.x generated by the bias error limitation unit 109 with the extended Kalman filter, and the transformation matrix H and the Kalman coefficient K calculated by the correction coefficient calculation unit 104.
[0132] The posture information (posture quaternion q) of the object, which has been estimated by the processing unit 20 can be transmitted to another device via the communication unit 60.
2-2. Configuration of Processing Unit
[0133]
[0134] The bias removal unit 22 calculates the average value of the angular velocity vector ω in the period from the time point t.sub.k-1 to the time point t.sub.k by Expression (19), by subtracting the residual bias b.sub.ω, k-1 at the time point t.sub.k-1 from the angular velocity data d.sub.ω, k at the time point t.sub.k. The bias removal unit 22 calculates the average value of the acceleration vector α in the period from the time point t.sub.k-1 to the time point t.sub.k by Expression (19), by subtracting the residual bias b.sub.α, k-1 at the time point t.sub.k-1 from the acceleration data d.sub.α, k at the time point t.sub.k.
[0135] The posture-change-amount calculation unit 24 calculates the approximation of the posture change amount Δq.sub.k at the time point t.sub.k by substituting the average value of the angular velocity vector ω in the period from the time point t.sub.k-1 to the time point t.sub.k and the average value of the angular velocity vector ω in a period from a time point t.sub.k-2 to the time point t.sub.k-1 into the polynomial expression of Expression (6). The above-described average values of the angular velocity vector ω have been calculated by the bias removal unit 22.
[0136] The velocity-change-amount calculation unit 26 calculates a coordinate transformation matrix C.sub.k at the time point t.sub.k from the posture quaternion q.sub.k-1 at the time point t.sub.k-1 with Expression (9). The velocity-change-amount calculation unit 26 calculates the approximation of the acceleration λ.sub.k at the time point t.sub.k by substituting the average value of the acceleration vector α and the average value of the angular velocity vector ω in the period from the time point t.sub.k-1 to the time point t.sub.k, and the average value of the acceleration vector α and the average value of the angular velocity vector ω in the period from the time point t.sub.k-2 to the time point t.sub.k-1 into the polynomial expression of Expression (12). The above-described average values of the angular velocity vector ω and the acceleration vector α have been calculated by the bias removal unit 22. The velocity-change-amount calculation unit 26 calculates a gravitational acceleration vector g.sub.k-1 at the time point t.sub.k-1 by substituting a gravitational-acceleration correction value Δg.sub.k-1 at the time point t.sub.k-1 into Expression (10). The velocity-change-amount calculation unit 26 calculates the velocity change amount (C.sub.kλ.sub.k−g.sub.k-1)Δt at the time point t.sub.k from the coordinate transformation matrix C.sub.k, the acceleration λ.sub.k, and the gravitational acceleration vector g.sub.k-1, which have been calculated.
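The coordinate transformation matrix and the velocity change amount can be sketched as follows. Expression (9) is assumed to be the standard quaternion-to-rotation-matrix formula in a scalar-first convention, which is an assumption of this sketch.

```python
import numpy as np

def coordinate_transform(q):
    # Rotation matrix from the posture quaternion (scalar-first
    # (w, x, y, z) convention assumed).
    w, x, y, z = q
    return np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z), 2 * (x * z + w * y)],
        [2 * (x * y + w * z), 1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y), 2 * (y * z + w * x), 1 - 2 * (x * x + y * y)],
    ])

def velocity_change(C_k, lam_k, g_prev, dt):
    # (C_k λ_k − g_{k-1}) Δt: rotate the sensor-frame acceleration
    # into the reference frame, remove gravity, and integrate over Δt.
    return (C_k @ lam_k - g_prev) * dt
```

For a stationary object whose accelerometer measures exactly gravity, the velocity change amount is zero, which matches the zero-velocity observation used later in the filter.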
[0137] As illustrated in
[0138] The integration calculation unit 101 performs quaternion multiplication of the posture quaternion q.sub.k-1 at the time point t.sub.k-1 and the posture change amount Δq.sub.k at the time point t.sub.k, which has been calculated by the posture-change-amount calculation unit 24, with Expression (6). The integration calculation unit 101 adds a motion velocity vector v.sub.k-1 at the time point t.sub.k-1 and the velocity change amount (C.sub.kλ.sub.k−g.sub.k-1)Δt at the time point t.sub.k, which has been calculated by the velocity-change-amount calculation unit 26, with Expression (12).
[0139] The posture information prediction unit 102 predicts the posture quaternion q.sub.k, the motion velocity vector v.sub.k, the residual bias b.sub.ω, k of the angular velocity sensor 12, the residual bias b.sub.α, k of the acceleration sensor 14, and the gravitational-acceleration correction value Δg.sub.k which are elements of the state vector x.sub.k, with Expression (19). Specifically, the posture information prediction unit 102 predicts the posture quaternion q.sub.k as a result of the quaternion multiplication of the posture quaternion q.sub.k-1 and the posture change amount Δq.sub.k by the integration calculation unit 101. The posture information prediction unit 102 predicts the motion velocity vector v.sub.k as a result of adding the motion velocity vector v.sub.k-1 and the velocity change amount (C.sub.kλ.sub.k−g.sub.k-1)Δt by the integration calculation unit 101. The posture information prediction unit 102 predicts the residual bias b.sub.ω, k of the angular velocity sensor 12 to be the residual bias b.sub.ω, k-1 of the angular velocity sensor 12 at the time point t.sub.k-1. The posture information prediction unit 102 predicts the residual bias b.sub.α, k of the acceleration sensor 14 to be the residual bias b.sub.α, k-1 of the acceleration sensor 14 at the time point t.sub.k-1. The posture information prediction unit 102 predicts the gravitational-acceleration correction value Δg.sub.k to be the gravitational-acceleration correction value Δg.sub.k-1 at the time point t.sub.k-1.
[0140] The error information update unit 103 updates the error covariance matrix Σ.sub.x, k.sup.2 at the time point t.sub.k with Expressions (20) and (21). The error information update unit 103 performs the update with the posture change amount Δq.sub.k calculated by the posture-change-amount calculation unit 24, the acceleration λ.sub.k and the coordinate transformation matrix C.sub.k calculated by the velocity-change-amount calculation unit 26, the posture quaternion q.sub.k-1 at the time point t.sub.k-1, and the error covariance matrix Σ.sub.x, k-1.sup.2 at the time point t.sub.k-1.
[0141] The rotational error-component removal unit 108 calculates a matrix D′.sup.TD′ with the posture quaternion q.sub.k predicted by the posture information prediction unit 102, by Expression (29). The rotational error-component removal unit 108 updates the error covariance matrix Σ.sub.x, k.sup.2 updated by the error information update unit 103, with Expression (28) and the matrix D′.sup.TD′. Thus, the error covariance matrix Σ.sub.x, k.sup.2 is generated in which the rank of the error covariance matrix Σ.sub.q, k.sup.2 of the posture quaternion q.sub.k is limited to 3 and the azimuth error component is removed.
[0142] The bias error limitation unit 109 calculates the variance σ.sub.bωv.sup.2 of the vertical component of the bias error in the angular velocity sensor 12 with the posture quaternion q.sub.k predicted by the posture information prediction unit 102 and the error covariance matrix Σ.sub.x, k.sup.2 generated by the rotational error-component removal unit 108, by Expression (39). When the variance σ.sub.bωv.sup.2 exceeds the upper limit value σ.sub.bωmax.sup.2, the bias error limitation unit 109 updates the error covariance matrix Σ.sub.x, k.sup.2 generated by the rotational error-component removal unit 108, by Expression (40). Thus, in the error covariance matrix Σ.sub.x, k.sup.2, the variance σ.sub.bωv.sup.2 of the vertical component of the bias error in the angular velocity sensor 12 is limited to the upper limit value σ.sub.bωmax.sup.2.
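The limiting step can be sketched as a symmetric rescaling of the offending row and column of the covariance matrix, which reduces the diagonal entry to the upper limit while keeping the matrix symmetric and positive semi-definite. The diagonal index of the vertical bias-error component and the exact scaling of Expression (40) are assumptions of this sketch.

```python
import numpy as np

def limit_bias_variance(sigma_x2, idx, var_max):
    # Clamp the variance at diagonal index `idx` (assumed to be the
    # vertical component of the angular velocity bias error) to var_max.
    var = sigma_x2[idx, idx]
    if var > var_max:
        s = np.sqrt(var_max / var)
        sigma_x2 = sigma_x2.copy()
        # Scaling row and column by s multiplies the diagonal by s^2,
        # so it lands exactly on var_max; symmetry is preserved.
        sigma_x2[idx, :] *= s
        sigma_x2[:, idx] *= s
    return sigma_x2
```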
[0143] The correction coefficient calculation unit 104 calculates the observation residual Δz.sub.k, the transformation matrix H.sub.k, and the Kalman coefficient K.sub.k at the time point t.sub.k with Expressions (23) to (26). The correction coefficient calculation unit 104 performs the calculation with the error covariance matrix Σ.sub.x, k.sup.2 generated by the bias error limitation unit 109, the coordinate transformation matrix C.sub.k calculated by the velocity-change-amount calculation unit 26, the average value of the acceleration vector α calculated by the bias removal unit 22 in the period from the time point t.sub.k-1 to the time point t.sub.k, and the posture quaternion q.sub.k and the gravitational-acceleration correction value Δg.sub.k predicted by the posture information prediction unit.
[0144] The posture information correction unit 105 corrects the elements (posture quaternion q.sub.k, motion velocity vector v.sub.k, residual bias b.sub.ω, k of the angular velocity sensor 12, residual bias b.sub.α, k of the acceleration sensor 14, and gravitational-acceleration correction value Δg.sub.k) of the state vector x.sub.k predicted by the posture information prediction unit 102. The posture information correction unit 105 performs the correction by Expression (27) with the observation residual Δz.sub.k and the Kalman coefficient K.sub.k calculated by the correction coefficient calculation unit 104.
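The correction itself is assumed to be the standard additive Kalman update x ← x + K Δz (Expression (27) is taken to be of this form in this sketch); the quaternion elements are renormalized afterward by the normalization unit.

```python
import numpy as np

def correct_state(x, K, dz):
    # Additive Kalman correction of the full state vector
    # (q, v, b_ω, b_α, Δg); the quaternion part is renormalized
    # in a subsequent step.
    return x + K @ dz
```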
[0145] The normalization unit 106 normalizes the elements (posture quaternion q.sub.k, motion velocity vector v.sub.k, residual bias b.sub.ω, k of the angular velocity sensor 12, residual bias b.sub.α, k of the acceleration sensor 14, and gravitational-acceleration correction value Δg.sub.k) of the state vector x.sub.k corrected by the posture information correction unit 105, with Expression (28).
[0146] The error information correction unit 107 corrects the error covariance matrix Σ.sub.x, k.sup.2 generated by the bias error limitation unit 109 by Expression (27) with the transformation matrix H.sub.k and the Kalman coefficient K.sub.k calculated by the correction coefficient calculation unit 104.
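A minimal sketch of the covariance correction, assuming the standard (I − K H) Σ form (the exact Expression (27) is not reproduced in this excerpt):

```python
import numpy as np

def correct_error_covariance(sigma_x2, K, H):
    # Standard EKF measurement update of the error covariance.
    n = sigma_x2.shape[0]
    return (np.eye(n) - K @ H) @ sigma_x2
```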
[0147] If the next sampling interval Δt has elapsed, the state vector x.sub.k calculated by the normalization unit 106 and the error covariance matrix Σ.sub.x, k.sup.2 corrected by the error information correction unit 107 are fed back to the bias removal unit 22, the velocity-change-amount calculation unit 26, the integration calculation unit 101, the posture information prediction unit 102, and the error information update unit 103 in a state of being set to be the state vector x.sub.k-1 and the error covariance matrix Σ.sub.x, k-1.sup.2 at the time point t.sub.k-1.
[0148] The processing unit 20 described above performs posture estimation processing of estimating the posture of the object, for example, in accordance with the procedures illustrated in
[0149] According to the above-described posture estimation device 1 in the embodiment, the processing unit 20 estimates the posture of an object in accordance with the procedures illustrated in
3. Modification Examples
[0150] In the above-described embodiment, the angular velocity sensor and the acceleration sensor are integrated by being accommodated in one inertial measurement unit (IMU). However, the angular velocity sensor and the acceleration sensor may be provided separately from each other.
[0151] In the above-described embodiment, the posture estimation device outputs only posture information of the object. However, the posture estimation device may output another kind of information. For example, the posture estimation device may output position information of the object, which is obtained by integrating the velocity information of the object, that is, the motion velocity vector v.sub.k at the time point t.sub.k.
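Such position output could be obtained, for instance, by simple Euler integration of the motion velocity over the sampling interval. No such expression appears in the excerpt, so this is purely an illustrative sketch.

```python
import numpy as np

def integrate_position(p_prev, v_k, dt):
    # p_k = p_{k-1} + v_k Δt : Euler integration of the estimated
    # motion velocity vector (illustrative; not part of the
    # embodiment's expression set).
    return p_prev + v_k * dt
```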
[0152] In the above-described embodiment, the acceleration sensor and the angular velocity sensor update the outputs at the same sampling interval Δt. However, the sampling interval Δt.sub.a of the acceleration sensor may be different from the sampling interval Δt.sub.ω of the angular velocity sensor. In this case, the prediction processing (time update processing) of the state vector and the error covariance matrix in Step S3 in
[0153] In the above-described embodiment, the posture estimation device estimates the posture of an object by using the gravitational acceleration vector observed by the acceleration sensor as the reference vector, and by using the output of the angular velocity sensor and the reference vector. However, the reference vector may not be the gravitational acceleration vector. For example, when the posture estimation device estimates the posture of an object by using the angular velocity sensor and a geomagnetic sensor, the reference vector may be a geomagnetic vector (vector directed toward the north) observed by using a geomagnetic sensor. For example, when the posture estimation device estimates the posture of a satellite as an object by the angular velocity sensor and a star tracker, the reference vector may be a vector which is directed from the object toward a fixed star and is observed by the star tracker. The acceleration sensor, the geomagnetic sensor, and the star tracker are examples of a reference observation sensor that observes the reference vector.
[0154] In the above-described embodiment, the posture information of an object is expressed in quaternion. However, the posture information may be information expressed by a roll angle, a pitch angle, and a yaw angle or may be a posture transformation matrix.
[0155] In the above-described embodiment, the state vector x.sub.k includes the posture quaternion q.sub.k, the motion velocity vector v.sub.k, the residual bias b.sub.ω, k of the angular velocity sensor, the residual bias b.sub.α, k of the acceleration sensor, and the gravitational-acceleration correction value Δg.sub.k, as the elements. However, the state vector x.sub.k is not limited thereto. For example, the state vector x.sub.k may include the posture quaternion q.sub.k, the residual bias b.sub.ω, k of the angular velocity sensor, the residual bias b.sub.α, k of the acceleration sensor, and the gravitational-acceleration correction value Δg.sub.k, as the elements, and may not include the motion velocity vector v.sub.k.
4. Electronic Device
[0156]
[0157] The communication unit 310 is, for example, a wireless circuit. The communication unit 310 performs processing of receiving data from the outside of the device or transmitting data to the outside thereof, via the antenna 312.
[0158] The posture estimation device 1 performs processing based on the output signal of the inertial measurement unit 10. Specifically, the posture estimation device 1 performs processing of estimating the posture of the electronic device 300 based on an output signal (output data) such as detection data of the inertial measurement unit 10. The posture estimation device 1 may perform signal processing such as correction processing or filtering on the output signal (output data) such as detection data of the inertial measurement unit 10. In addition, based on the output signals, the posture estimation device 1 may perform various types of control processing for the electronic device 300, such as control processing of the electronic device 300 or various types of digital processing of data transmitted or received via the communication unit 310. The function of the posture estimation device 1 can be realized by a processor such as an MPU or a CPU, for example.
[0159] The operation unit 330 is used when a user performs an input operation. The operation unit 330 can be realized by an operation button, a touch panel display, or the like.
[0160] The display unit 340 displays various types of information and can be realized by a display of liquid crystal, organic EL, or the like. The storage unit 350 stores data. The function of the storage unit 350 can be realized by a semiconductor memory such as a RAM or a ROM.
[0161] The electronic device 300 in the embodiment can be applied to, for example, an image-related device (such as a digital still camera or a video camera), an in-vehicle device, a wearable device (such as a head mounted display device or a watch-related device), an ink jet discharge device, a robot, a personal computer, a portable information terminal, a printing device, and a projection device. The in-vehicle device includes a car navigation device or a device for automatic driving, for example. The watch-related device includes a watch or a smart watch, for example. For example, an ink jet printer is provided as the ink jet discharge device. The portable information terminal includes a smart phone, a portable phone, a portable game device, a notebook PC, or a tablet terminal, for example. The electronic device 300 in the embodiment can also be applied to an electronic notebook, an electronic dictionary, a calculator, a word processor, a workstation, a videophone, a television monitor for crime prevention, electronic binoculars, a POS terminal, a medical device, a fish finder, a measuring device, and a device for a base station of a mobile terminal, instruments, a flight simulator, and a network server, for example. The medical device includes an electronic thermometer, a sphygmomanometer, a blood glucose meter, an electrocardiogram measuring device, an ultrasonic diagnostic device, an electronic endoscope, and the like. The instruments are instruments of vehicles, aircraft, ships, and the like.
[0162]
[0163] As illustrated in
[0164] In the display unit 402, position information or the movement amount obtained by a GPS sensor 411 or a geomagnetic sensor 412, motion information (such as an amount of exercise) obtained by the acceleration sensor 14 or the angular velocity sensor 12, biometric information (such as a pulse rate) obtained by a pulse rate sensor 416, and time information such as the current time are displayed in accordance with various detection modes. An environmental temperature obtained by a temperature sensor 417 can also be displayed. The communication unit 422 communicates with an information terminal such as a user terminal. The posture estimation device 1 is realized by an MPU, a DSP, and an ASIC, for example. The posture estimation device 1 performs various types of processing based on programs stored in a storage unit 420 and information input by an operation unit 418 such as the operation buttons 406 and 407. The posture estimation device 1 performs processing of estimating posture information of the activity meter 400 based on the output signal of the inertial measurement unit 10. The posture estimation device 1 may perform processing based on output signals of the GPS sensor 411, the geomagnetic sensor 412, a pressure sensor 413, the acceleration sensor 14, the angular velocity sensor 12, the pulse rate sensor 416, the temperature sensor 417, and a timekeeping unit 419. The posture estimation device 1 can also perform display processing of displaying an image in the display unit 402, sound output processing of outputting sound to a sound output unit 421, communication processing of communicating with an information terminal via the communication unit 422, power control processing of supplying power from a battery 423 to the components, and the like.
[0165] According to the activity meter 400 having the above-described configuration in the embodiment, it is possible to exhibit the above-described effects of the posture estimation device 1 and to exhibit high reliability. The activity meter 400 includes the GPS sensor 411 and can measure the movement distance and the movement trajectory of the user. Thus, the activity meter 400 having high usability is obtained. The activity meter 400 can be widely applied to a running watch, a runner's watch, an outdoor watch, a GPS watch, and the like.
5. Vehicle
[0166] In the embodiment, a vehicle includes the posture estimation device 1 in the above embodiment, and a control device that controls the posture of the vehicle based on the posture information of the vehicle, which has been estimated by the posture estimation device 1.
[0167]
[0168] The positioning device 510 is a device that is mounted on the vehicle 500 and performs positioning of the vehicle 500. The positioning device 510 includes the inertial measurement unit 10, a GPS receiving unit 520, an antenna 522 for GPS reception, and the posture estimation device 1. The posture estimation device 1 includes a position information acquisition unit 532, a position composition unit 534, an operational processing unit 536, and a processing unit 538. The inertial measurement unit 10 includes a three-axis acceleration sensor and a three-axis angular velocity sensor. The operational processing unit 536 receives acceleration data and angular velocity data from the acceleration sensor and the angular velocity sensor, performs inertial navigation operational processing on the received data, and outputs inertial navigation positioning data. The inertial navigation positioning data indicates the acceleration or the posture of the vehicle 500.
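The inertial navigation operational processing mentioned above is not detailed in this section, but one standard building block of such processing is strapdown attitude propagation, in which each angular-rate sample from the angular velocity sensor is converted into a small rotation and composed with the current attitude quaternion. The following sketch is illustrative only and is not the patented method; the function name and the Hamilton quaternion convention q = (w, x, y, z) are assumptions:

```python
import math

def integrate_gyro(q, omega, dt):
    # One illustrative strapdown update step: build the small rotation
    # quaternion from the angular-rate vector omega (rad/s) applied over
    # dt seconds, then compose it with the current attitude quaternion
    # q = (w, x, y, z) using the Hamilton product.
    wx, wy, wz = (o * dt for o in omega)
    angle = math.sqrt(wx * wx + wy * wy + wz * wz)
    if angle < 1e-12:
        dq = (1.0, 0.0, 0.0, 0.0)  # no measurable rotation this step
    else:
        s = math.sin(angle / 2.0) / angle
        dq = (math.cos(angle / 2.0), wx * s, wy * s, wz * s)
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = dq
    return (w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
            w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
            w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
            w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2)
```

For example, integrating a rate of π rad/s about the z-axis for one second starting from the identity quaternion yields a 180-degree rotation about z.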
[0169] The GPS receiving unit 520 receives a signal from a GPS satellite via the antenna 522. The position information acquisition unit 532 outputs GPS positioning data based on a signal received by the GPS receiving unit 520. The GPS positioning data indicates the position, the speed, and the direction of the vehicle 500 on which the positioning device 510 is mounted. The position composition unit 534 calculates a position at which the vehicle 500 runs on the ground at the current time, based on the inertial navigation positioning data output from the operational processing unit 536 and the GPS positioning data output from the position information acquisition unit 532. For example, if the posture of the vehicle 500 changes under the influence of an inclination (θ) of the ground or the like as illustrated in
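To illustrate why the posture matters when composing the position: a GPS fix describes travel in the horizontal map plane, while the vehicle actually moves along the inclined road surface. A minimal sketch of one possible correction, assuming the posture estimate supplies the pitch angle θ (the function name and this specific correction rule are assumptions, not the patented composition processing):

```python
import math

def slope_distance(gps_horizontal_distance_m, pitch_rad):
    # Illustrative correction: on ground inclined by pitch_rad, the
    # distance travelled along the road surface exceeds the horizontal
    # (map-plane) distance reported by GPS by a factor of 1/cos(theta).
    return gps_horizontal_distance_m / math.cos(pitch_rad)
```

On flat ground (θ = 0) the GPS distance is unchanged; on a 60-degree slope, 100 m of horizontal travel corresponds to 200 m along the surface.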
[0170] The control device 570 controls the driving mechanism 580, the braking mechanism 582, and the steering mechanism 584 of the vehicle 500. The control device 570 is a controller of controlling the vehicle. For example, the control device 570 can be realized by a plurality of control units. The control device 570 includes a vehicle control unit 572 being a control unit that controls the vehicle, an automatic driving control unit 574 being a control unit that controls automatic driving, and a storage unit 576 realized by a semiconductor memory and the like. A monitoring device 578 monitors an object such as an obstacle around the vehicle 500. The monitoring device 578 is realized by a surrounding monitoring camera, a millimeter wave radar, a sonar or the like.
[0171] As illustrated in
[0172] In the embodiment, the control device 570 controls at least one of accelerating, braking, and steering of the vehicle 500 based on information of the position and the posture of the vehicle 500, which has been obtained by the posture estimation device 1. For example, the control device 570 controls at least one of the driving mechanism 580, the braking mechanism 582, and the steering mechanism 584 based on the information of the position and the posture of the vehicle 500. Thus, for example, it is possible to realize automatic driving control of the vehicle 500 by the automatic driving control unit 574. In the automatic driving control, a monitoring result of a surrounding object by the monitoring device 578, map information or running route information stored in the storage unit 576, and the like are used in addition to the information of the position and the posture of the vehicle 500. The control device 570 switches between execution and non-execution of the automatic driving of the vehicle 500 based on a monitoring result of the output signal of the inertial measurement unit 10. For example, the posture estimation device 1 monitors the output signal such as detection data from the inertial measurement unit 10. When a decrease in detection accuracy of the inertial measurement unit 10 or a sensing failure is detected from the monitoring result, for example, the control device 570 switches from execution to non-execution of the automatic driving. In the automatic driving, at least one of accelerating, braking, and steering of the vehicle 500 is automatically controlled; when the automatic driving is not executed, such automatic control of accelerating, braking, and steering is not performed. In this manner, more reliable driving assistance is possible for the vehicle 500 that performs automatic driving.
An automation level of the automatic driving may be switched based on the monitoring result of the output signal of the inertial measurement unit 10.
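The switching logic described above can be sketched as a simple monitoring rule. Everything specific in this sketch is an assumption for illustration (the off-scale test as the failure criterion, the full-scale value, and the function name); the patent itself only states that execution or the automation level is switched based on the monitoring result:

```python
def select_automation_level(accel_samples_g, full_scale_g=16.0, current_level=3):
    # Hypothetical monitoring rule: if any recent acceleration sample is
    # off-scale (clipped at the sensor's full-scale range), detection
    # accuracy of the inertial measurement unit is suspect, so fall back
    # to manual driving (level 0). Otherwise keep the current automation
    # level unchanged.
    if any(abs(a) >= full_scale_g for a in accel_samples_g):
        return 0
    return current_level
```

In practice such a check would run continuously on the monitored detection data, alongside other plausibility tests.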
[0173]
[0174] As illustrated in
[0175] The work mechanism 620 includes a boom 613, an arm 614, a bucket link 616, a bucket 615, a boom cylinder 617, an arm cylinder 618, and a bucket cylinder 619, as the plurality of members. The boom 613 is attached to the front portion of the upper revolving body 611 to be capable of elevating. The arm 614 is attached to the tip of the boom 613 to be capable of elevating. The bucket link 616 is attached to the tip of the arm 614 to be rotatable. The bucket 615 is attached to the tip of the arm 614 and the bucket link 616 to be rotatable. The boom cylinder 617 drives the boom 613. The arm cylinder 618 drives the arm 614. The bucket cylinder 619 drives the bucket 615 through the bucket link 616.
[0176] The base end of the boom 613 is supported by the upper revolving body 611 to be rotatable in the up-and-down direction. The boom 613 is rotationally driven relative to the upper revolving body 611 by expansion and contraction of the boom cylinder 617. An inertial measurement unit 10c functioning as an inertial sensor that detects the motion state of the boom 613 is disposed in the boom 613.
[0177] One end of the arm 614 is supported by the tip of the boom 613 to be rotatable. The arm 614 is rotationally driven relative to the boom 613 by expansion and contraction of the arm cylinder 618. An inertial measurement unit 10b functioning as an inertial sensor that detects the motion state of the arm 614 is disposed in the arm 614.
[0178] The bucket link 616 and the bucket 615 are supported by the tip of the arm 614 to be rotatable. The bucket link 616 is rotationally driven relative to the arm 614 by expansion and contraction of the bucket cylinder 619. The bucket 615 is rotationally driven relative to the arm 614 with the bucket link 616 driven. An inertial measurement unit 10a functioning as an inertial sensor that detects the motion state of the bucket link 616 is disposed in the bucket link 616.
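Posture estimates for the boom, arm, and bucket link obtained from the inertial measurement units 10a to 10c can, in principle, be chained together to locate the bucket tip. The following planar forward-kinematics sketch is illustrative only: the link lengths, the two-dimensional simplification, and the function name are assumptions, not part of the described embodiment:

```python
import math

def bucket_tip_position(link_lengths_m, joint_angles_rad):
    # Illustrative 2-D forward kinematics: accumulate each joint angle
    # down the boom -> arm -> bucket chain and sum the rotated link
    # vectors to get the tip position relative to the boom pivot.
    x = y = 0.0
    angle = 0.0
    for length, joint in zip(link_lengths_m, joint_angles_rad):
        angle += joint
        x += length * math.cos(angle)
        y += length * math.sin(angle)
    return x, y
```

For a two-link chain of unit lengths with the second joint bent 90 degrees, the tip lands at (1, 1) relative to the pivot.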
[0179] Here, the inertial measurement unit 10 described in the above embodiment can be used as the inertial measurement units 10a, 10b, 10c, and 10d. The inertial measurement units 10a, 10b, 10c, and 10d can detect at least any of an angular velocity and an acceleration acting on the members of the work mechanism 620 or the upper revolving body 611. As illustrated in
[0180] Further, as illustrated in
[0181] As the construction machine in which the posture estimation device 1 in the above embodiment is used, for example, a rough terrain crane (crane car), a bulldozer, an excavator loader, a wheel loader, and an aerial work vehicle (lift car) are provided in addition to the hydraulic shovel (jumbo, backhoe, and power shovel) exemplified above.
[0182] According to the embodiment, with the posture estimation device 1, it is possible to obtain posture information with high accuracy. Thus, appropriate posture control of the vehicle 600 can be realized. In addition, since the compact inertial measurement unit 10 is mounted in the vehicle 600, a plurality of inertial measurement units 10 can be disposed compactly at their installation sites even in a very narrow region such as the bucket link 616, and the cable routing that couples the inertial measurement units 10 installed at the sites to each other in series (multi-coupling) can also be performed compactly.
[0183] In the embodiment, descriptions are made by using four-wheel vehicles such as the agricultural machine and the construction machine as examples of the vehicle in which the posture estimation device 1 is used. However, the posture estimation device 1 can also be used in motorcycles, bicycles, trains, airplanes, biped robots, remote-controlled or autonomous aircraft (such as radio-controlled airplanes, radio-controlled helicopters, and drones), rockets, satellites, ships, and automated guided vehicles (AGVs), for example.
[0184] The present disclosure is not limited to the embodiment, and various modifications can be made in a range of the gist of the present disclosure.
[0185] The above-described embodiment and modification examples are examples, and the present disclosure is not limited thereto. For example, the embodiment and the modification examples can be appropriately combined.
[0186] The present disclosure includes a configuration which is substantially the same as the configuration described in the embodiment (for example, configuration having the same function, method, and result or configuration having the same purpose and effect). The present disclosure includes a configuration in which the not-essential component in the configuration described in the embodiment has been replaced. The present disclosure includes a configuration of exhibiting the same effects as those in the configuration described in the embodiment or a configuration capable of achieving the same purpose. The present disclosure includes a configuration in which a known technique is added to the configuration described in the embodiment.