Real-time three-dimensional map building method and device using three-dimensional lidar
11525923 · 2022-12-13
Assignee
Inventors
Cpc classification
G01S17/894
PHYSICS
G06T19/00
PHYSICS
G01S7/4804
PHYSICS
International classification
G01S7/481
PHYSICS
G01S17/894
PHYSICS
Abstract
A real-time three-dimensional (3D) map building method and device using a 3D lidar include: representing 3D map data of a surrounding environment, acquired by using a 3D lidar attached to a moving object, as voxels; acquiring an eigenvalue and an eigenvector for each voxel based on all 3D points in a 3D map represented as the voxels; detecting a 3D corresponding point in the voxel corresponding to each of the 3D points of 3D data newly acquired by using the 3D lidar while the moving object travels; calculating a rotation transformation and a translation transformation that minimize an error by minimizing an inner product value between the eigenvector, weighted by the eigenvalue of the voxel to which the 3D corresponding point belongs, and a vector generated from the 3D corresponding point; and updating the 3D map data based on the rotation transformation and the translation transformation.
Claims
1. A real-time three-dimensional map building method using a three-dimensional lidar, comprising: representing three-dimensional map data of a surrounding environment acquired by using a three-dimensional lidar attached to a moving object as voxels; acquiring an eigenvalue and an eigenvector for each voxel based on all three-dimensional points in a three-dimensional map represented as the voxels; detecting three-dimensional corresponding points in the voxels corresponding to all the three-dimensional points newly acquired by using the three-dimensional lidar while the moving object travels; calculating a rotation transformation and a translation transformation for minimizing an error by minimizing an inner product value between the eigenvector weighted by the eigenvalue of the voxel to which the three-dimensional corresponding point belongs and a vector generated from a three-dimensional corresponding point; and updating the three-dimensional map data based on the rotation transformation and the translation transformation, wherein the rotation transformation and the translation transformation are acquired by an equation below,
2. The real-time three-dimensional map building method of claim 1, wherein the updating of the three-dimensional map data includes acquiring a new eigenvalue and eigenvector if the three-dimensional point transformed based on the rotation transformation and the translation transformation is in a new region, and updating the eigenvalue and eigenvector of the voxel if the three-dimensional point belongs to the voxel in the same space as the three-dimensional corresponding point.
3. The real-time three-dimensional map building method of claim 1, wherein the calculating of the rotation transformation and the translation transformation includes determining that the error is minimum if differences between the rotation transformation and translation transformation which are updated and the rotation transformation and translation transformation which are calculated immediately before are less than a predetermined threshold.
4. A real-time three-dimensional map building device using a three-dimensional lidar, comprising: a voxel representation unit that represents three-dimensional map data of a surrounding environment acquired by using a three-dimensional lidar attached to a moving object as voxels; an eigenvalue and eigenvector acquisition unit that acquires an eigenvalue and an eigenvector for each voxel based on all three-dimensional points in a three-dimensional map represented as the voxels; a corresponding point detection unit that detects a three-dimensional corresponding point in the voxel corresponding to all the three-dimensional points of three-dimensional data newly acquired by using the three-dimensional lidar while the moving object travels; a transformation calculation unit that calculates a rotation transformation and a translation transformation for minimizing an error by minimizing an inner product value between the eigenvector weighted by the eigenvalue of the voxel to which the three-dimensional corresponding point belongs and a vector generated from a three-dimensional corresponding point; and a map updating unit that updates the three-dimensional map data based on the rotation transformation and the translation transformation, wherein the rotation transformation and the translation transformation are acquired by an equation below,
5. The real-time three-dimensional map building device of claim 4, wherein the map updating unit acquires a new eigenvalue and eigenvector if the three-dimensional point transformed based on the rotation transformation and the translation transformation is in a new region, and updates the eigenvalue and eigenvector of the voxel if the three-dimensional point belongs to the voxel in the same space as the three-dimensional corresponding point.
6. The real-time three-dimensional map building device of claim 4, wherein the transformation calculation unit determines that the error is minimum if differences between the rotation transformation and translation transformation which are updated and the rotation transformation and translation transformation which are calculated immediately before are less than a predetermined threshold.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE INVENTION
(12) Since the present invention may be variously changed and have several embodiments, specific embodiments will be illustrated in the drawings and described in detail in the specification. However, this is not intended to limit the present invention to the specific embodiments, and it should be understood that the present invention includes all modifications, equivalents, and substitutes included in the spirit and scope of the present invention. In describing the drawings, similar reference numerals are used for similar configuration elements.
(13) Terms such as first, second, A, and B may be used to describe various configuration elements, but the configuration elements should not be limited by the terms. The terms are used only for the purpose of distinguishing one configuration element from another configuration element. For example, without departing from the scope of the present invention, the first configuration element may be referred to as the second configuration element, and similarly, the second configuration element may also be referred to as the first configuration element. A term and/or includes a combination of a plurality of related items or any item of the plurality of related items.
(14) When one configuration element is referred to as being “connected” or “coupled” to another configuration element, the one configuration element may be directly connected or coupled to the other configuration element, but it should be understood that still another configuration element may exist therebetween. Meanwhile, when one configuration element is referred to as being “directly connected” or “directly coupled” to another configuration element, it should be understood that there are no other configuration elements therebetween.
(15) The terminology used in the present application is for the purpose of describing the specific embodiments only and is not intended to limit the present invention. A singular expression includes plural expressions unless the context clearly indicates otherwise. In the present application, it should be understood that the terms “include”, “have”, and the like are intended to designate the existence of a feature, a number, a step, an operation, a configuration element, a component, or a combination thereof described in the specification, and do not exclude in advance the possibility of the existence or addition of one or more other features, numbers, steps, operations, configuration elements, components, or combinations thereof.
(16) Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meanings as commonly understood by those skilled in the art to which the present invention belongs. Terms such as those defined in commonly used dictionaries should be construed as having meanings consistent with their meanings in the context of the related art, and shall not be construed as having ideal or excessively formal meanings unless expressly defined in the present application.
(17) Three-dimensional mapping is becoming an important axis of the Fourth Industrial Revolution, spanning accurate three-dimensional mapping for autonomous vehicles, three-dimensional mapping for autonomous flying drones, and virtual reality (VR) services for indoor and outdoor spaces. Currently, the technology is largely divided into two types. One is a mobile mapping system (MMS) that matches three-dimensional lidar (3D LiDAR) data based on position and posture information acquired through the fusion of several sensors, such as real-time kinematic (RTK), an inertial measurement unit (IMU), and a distance measurement indicator (DMI).
(18) The other is a method of estimating the position and posture by using an algorithm such as scan matching on the data acquired from the three-dimensional lidar, while using the sensor data of the RTK and IMU as auxiliary data. One advantage of this second method is that three-dimensional mapping is possible even for an indoor space without a GPS signal. The present invention relates to a matching method based on the scan matching method of the three-dimensional lidar, which is the second method.
(19) Meanwhile, a traditional method of creating a map by using the three-dimensional lidar uses a go-stop motion method. This method has a problem in that, if the lidar moves during a three-dimensional scan, the movement distorts the measured data and increases the error, so that accuracy decreases. In addition, mapping takes a long time because the data is scanned while repeating a process of acquiring a distance image in a stationary state and then moving the sensor to the next position.
(20) In order to solve these problems, the present invention uses a unique continuous motion method, thereby generating an accurate three-dimensional map by estimating a translation transformation and a rotation transformation in real time even while the three-dimensional lidar moves. Therefore, the present invention may be applied to a dynamic robot, a drone, an autonomous vehicle, and the like.
(21) Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings.
(23) Referring to
(24) Further, referring to
(25) First, the three-dimensional map data of the surrounding environment acquired by using a three-dimensional lidar (for example, a Velodyne lidar) is represented as voxels (S100), and eigenvalues λ and eigenvectors ν are calculated based on all three-dimensional points in each voxel (S200).
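As an illustrative sketch only (not the patented embodiment itself), steps S100 and S200 can be realized as below; the hash-grid voxelization and the function name `build_voxel_map` are assumptions introduced here.

```python
import numpy as np

def build_voxel_map(points, voxel_size):
    """Group 3D points into voxels and compute, for each voxel, the
    centroid plus the eigenvalues/eigenvectors of the point covariance."""
    buckets = {}
    keys = np.floor(points / voxel_size).astype(int)
    for key, p in zip(map(tuple, keys), points):
        buckets.setdefault(key, []).append(p)
    voxel_map = {}
    for key, pts in buckets.items():
        pts = np.asarray(pts)
        center = pts.mean(axis=0)
        # Covariance of the points in the voxel; eigh returns eigenvalues
        # in ascending order (lambda_1 <= lambda_2 <= lambda_3).
        cov = np.cov(pts.T) if len(pts) > 1 else np.zeros((3, 3))
        eigvals, eigvecs = np.linalg.eigh(cov)
        voxel_map[key] = (center, eigvals, eigvecs)
    return voxel_map
```

For a voxel containing only planar points, the smallest eigenvalue is near zero and its eigenvector approximates the plane normal, which is what the later error weighting exploits.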
(27) Next, three-dimensional corresponding points in the voxels corresponding to all three-dimensional points of the three-dimensional data acquired by a new scan may be detected by using the three-dimensional lidar while a moving object (for example, an autonomous vehicle or a drone) travels (S300). The Euclidean distance may be used for the corresponding point detection, as in the iterative closest points (ICP) algorithm, but the present embodiment is not limited thereto. Since three-dimensional corresponding point detection is a known method, a description thereof will be omitted.
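The correspondence search of step S300 can be sketched as a brute-force Euclidean nearest-neighbor query; in practice a KD-tree replaces the pairwise distance matrix. The function name and array layout are assumptions for illustration.

```python
import numpy as np

def find_correspondences(new_points, map_points):
    """For each newly scanned point p_i, return the index of the closest
    map point q_k by Euclidean distance (brute force; a KD-tree would be
    used in practice for speed)."""
    # Pairwise squared distances, shape (N_new, N_map).
    d2 = ((new_points[:, None, :] - map_points[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1)
```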
(28) At this time, the error calculated when a point p_i has a corresponding point q_k after being transformed by a rotation transformation R and a translation transformation t is represented by Equation 1.
(29) [Equation 1]
(30) Next, for all three-dimensional points p_i of the newly acquired data, the rotation transformation R and the translation transformation t that minimize the error according to Equation 1 are acquired as follows (S400). That is, the rotation transformation and the translation transformation for minimizing the error are acquired by minimizing an inner product between the eigenvector weighted with the eigenvalue of the voxel to which the three-dimensional corresponding point belongs and a vector generated from the three-dimensional corresponding point.
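Since Equation 1 itself is not reproduced in this text, the following is an assumed formulation of the eigenvalue-weighted error described above: the difference vector between the transformed point and its correspondence is projected onto each eigenvector, and each projection is weighted by the matching eigenvalue before the squared terms are summed.

```python
import numpy as np

def voxel_weighted_error(p, q, eigvals, eigvecs, R, t):
    """Assumed per-point error: project the residual R@p + t - q onto the
    voxel's eigenvectors and weight each squared projection by its
    eigenvalue, so displacement along a small-eigenvalue direction
    (e.g., within a plane's surface) costs little."""
    d = R @ p + t - q
    proj = eigvecs.T @ d  # inner products with each eigenvector (columns)
    return float(np.sum(eigvals * proj ** 2))
```

With this form, sliding a point along a planar voxel's surface (the two large-spread directions) is penalized differently from moving it off the surface, which is the behavior the plane/line/cylinder discussion below relies on.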
(31) Meanwhile, in order to explain Equation 2, it is assumed that a plane, a straight line, and a cylindrical figure exist.
(32) [Equation 2]
(33) A corresponding point of the three-dimensional map data closest in Euclidean distance to the point newly acquired by using the three-dimensional lidar is found. At this time, a vector P made by the two three-dimensional points p_i and q_k is calculated by using the component values of the eigenvectors. That is, an error in each eigenvector direction is calculated. When the acquired eigenvalues are referred to as λ_1, λ_2, and λ_3, the relationship between the eigenvalues is as follows.
(34) λ_1 ≤ λ_2 ≤ λ_3
(35) The eigenvectors corresponding to the respective eigenvalues are ν_1, ν_2, and ν_3.
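The plane, straight line, and cylinder cases can be distinguished from the pattern of the sorted eigenvalues; the following sketch and its threshold are illustrative assumptions, not part of the claimed method.

```python
import numpy as np

def classify_voxel(eigvals, ratio=0.1):
    """Illustrative classification from sorted eigenvalues l1 <= l2 <= l3.
    Line: two small eigenvalues; plane: one small eigenvalue;
    otherwise the eigenvalues are similar (cylinder-like/volumetric)."""
    l1, l2, l3 = eigvals
    if l3 <= 0:
        return "degenerate"
    if l2 / l3 < ratio:   # only l3 is large -> points spread along a line
        return "line"
    if l1 / l3 < ratio:   # l2, l3 large, l1 small -> points on a plane
        return "plane"
    return "cylinder-like"
```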
(36) At this time, for example, if the differences between the rotation transformation and translation transformation updated by applying the Levenberg-Marquardt algorithm and the rotation transformation and translation transformation calculated immediately before are less than a predetermined threshold, the error is determined to be minimum, the algorithm ends, and the processing proceeds to the next step (S500).
(37) Next, the three-dimensional map data is updated based on the rotation transformation and the translation transformation calculated above (S500). At this time, if the three-dimensional point transformed based on the rotation transformation and the translation transformation falls in a new region, a new eigenvalue and eigenvector are acquired. If the three-dimensional point belongs to the voxel in the same space as the three-dimensional corresponding point, the eigenvalue and eigenvector of the voxel may be updated.
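The map update of step S500 can be sketched as follows; as a simplification, each voxel's statistics are recomputed from its accumulated points, whereas an incremental eigenvalue/eigenvector update would avoid the recomputation. The names and the dictionary-of-point-lists structure are assumptions.

```python
import numpy as np

def update_map(voxel_points, points, R, t, voxel_size):
    """Transform the new scan by (R, t) and fold it into the voxel map:
    points landing in an unseen voxel start a new point set, points in an
    existing voxel extend it, and the eigen statistics are recomputed."""
    for p in (points @ R.T) + t:
        key = tuple(np.floor(p / voxel_size).astype(int))
        voxel_points.setdefault(key, []).append(p)
    stats = {}
    for key, pts in voxel_points.items():
        pts = np.asarray(pts)
        cov = np.cov(pts.T) if len(pts) > 1 else np.zeros((3, 3))
        eigvals, eigvecs = np.linalg.eigh(cov)
        stats[key] = (pts.mean(axis=0), eigvals, eigvecs)
    return stats
```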
(39) Referring to
(41) If only a point cloud for a straight line exists in any one voxel and its eigenvalues are acquired, two of the eigenvalues may be small and the other eigenvalue may be relatively large. Therefore, the eigenvector direction error value of each of the two vectors may have a shape illustrated in the graph of
(43) If the eigenvalues for any one voxel are similar to one another, a cylinder may be assumed. In this case, each eigenvector direction error may be affected to some extent by all direction components, as in the plane of
(46) In the test, the exact rotation transformation and translation transformation were obtained as the true values by using values confirmed by the human eye while changing the values little by little; a graph of (a) of
(47) A main reason for this tendency in the ICP results is that, in three-dimensional map data with a low density, the Euclidean distance between two unrelated data points becomes small across empty space.
(49) In
(51) Referring to
(53) Hereinafter, an algorithm for creating the three-dimensional map according to the embodiment of the present invention will be described in more detail.
(54) As described above, a plurality of voxels V_a, V_b, and V_c are used to perform the algorithm for creating the three-dimensional map according to the embodiment of the present invention. The subscript of a voxel V represents the size of the voxel, with the relation a>b>c. The center (c_x, c_y, c_z) of a voxel represents the average of all three-dimensional points included in the voxel.
(55) Further, the upper voxels V_a and V_b are made by using the center of the voxel V_0. The top voxel V_a is made for real-time performance. This is because, when the three-dimensional map is created, the amount of data stored in the computer memory increases considerably with the travel distance, and thus real-time performance may not be guaranteed. Therefore, the data required to perform the current algorithm is selected through the voxel V_b extracted from the upper voxel V_a. The three-dimensional map data M_{t-1} includes all three-dimensional data generated up to the previous time t−1, and V_b is generated from a subset m_{t-1} ⊂ M_{t-1}. The elements included in the voxel V_b are the center, eigenvalues, and eigenvectors of the voxel, and are represented as follows.
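The selection of the working subset m_{t-1} from the full map M_{t-1} through the coarse voxel grid can be sketched as follows; the neighborhood radius of one coarse voxel around the sensor is an assumption chosen for illustration.

```python
import numpy as np

def voxel_key(point, size):
    """Integer grid index of a point at the given voxel size."""
    return tuple(np.floor(np.asarray(point) / size).astype(int))

def select_local_subset(map_points, sensor_pos, coarse_size):
    """Keep only the map points whose coarse voxel (the V_a level) lies
    within one voxel of the sensor, approximating the extraction of a
    working set m_{t-1} from the full map M_{t-1} for real-time use."""
    center = voxel_key(sensor_pos, coarse_size)
    keep = []
    for p in map_points:
        k = voxel_key(p, coarse_size)
        if all(abs(k[i] - center[i]) <= 1 for i in range(3)):
            keep.append(p)
    return np.asarray(keep)
```

Bounding the working set this way keeps the per-scan cost roughly constant regardless of how far the moving object has traveled.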
V_b = [c_x c_y c_z λ_1 λ_2 λ_3 ν_1^T ν_2^T ν_3^T]^T
(56) An element q of a voxel map m_{t-1} is a 3×1 vector q ∈ m_{t-1} having the center of the voxel as its value.
(57) Next, in order to acquire the correspondence relation between the two sets m_{t-1} and Z_t, the method of finding a correspondence relation in the ICP may be used. To find a more accurate correspondence relation, the element centers, eigenvalues, and eigenvectors of the two sets could all be compared; however, in order to ensure real-time performance, only the centers are compared, and it is possible to assume that the two three-dimensional points p_i and q_k with the smallest Euclidean distance between them correspond to each other. At this time, the above Equation 1 is calculated by using the eigenvalue and eigenvector of the voxel containing the two three-dimensional points p_i and q_k. A KD-tree may be used to find the correspondence relation most quickly.
(58) Now, the rotation transformation and the translation transformation that have to be calculated to build the three-dimensional map may be applied to the set Z_t to finally acquire the updated maps m_t and M_t from the three-dimensional map m_{t-1}. To this end, Equation 2 may be calculated for all data for which the correspondence relation is found. Since Equation 2 may not be calculated directly, the Levenberg-Marquardt algorithm may be applied among the optimization methods.
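A minimal damped least-squares (Levenberg-Marquardt-style) estimation of the rotation and translation can be sketched as follows, with plain squared point-to-point residuals standing in for the eigenvalue-weighted error of Equation 2, and with the convergence test on the parameter update as described in the text. The Euler-angle parameterization and all names are assumptions.

```python
import numpy as np

def rot(rx, ry, rz):
    """Rotation matrix from XYZ Euler angles."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def estimate_transform(src, dst, iters=30, lam=1e-3, tol=1e-10):
    """Damped least-squares fit of (R, t) aligning src to dst.
    Iteration stops when the parameter update falls below tol,
    mirroring the convergence test described above."""
    x = np.zeros(6)  # [rx, ry, rz, tx, ty, tz]

    def residuals(x):
        return ((src @ rot(*x[:3]).T + x[3:]) - dst).ravel()

    for _ in range(iters):
        r = residuals(x)
        J = np.empty((r.size, 6))
        eps = 1e-6
        for j in range(6):  # numeric Jacobian, column by column
            dx = np.zeros(6)
            dx[j] = eps
            J[:, j] = (residuals(x + dx) - r) / eps
        # Damped normal equations (Levenberg-Marquardt step)
        step = np.linalg.solve(J.T @ J + lam * np.eye(6), -J.T @ r)
        x = x + step
        if np.linalg.norm(step) < tol:
            break
    return rot(*x[:3]), x[3:]
```

In the full method, `residuals` would be replaced by the eigenvalue-weighted projections of Equation 1, evaluated over all correspondences between m_{t-1} and Z_t.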
(60) Referring to
(61) Hereinafter, a three-dimensional map building device according to another embodiment of the present invention will be described.
(63) Referring to
(64) Further, each configuration of the three-dimensional map building device 100 according to another embodiment of the present invention may be described with reference to
(65) The voxel representation unit 110 may represent the three-dimensional map data 20 for a surrounding environment acquired by using the three-dimensional lidar 10 as a voxel.
(66) The eigenvalue and eigenvector acquisition unit 120 may calculate an eigenvalue λ and an eigenvector ν of each voxel based on all three-dimensional points in each voxel.
(67) The corresponding point detection unit 130 may detect three-dimensional corresponding points in the voxels corresponding to all the three-dimensional points of the three-dimensional data acquired by a new scan using the three-dimensional lidar 10 while the moving object travels. At this time, the Euclidean distance may be used for the corresponding point detection, as in the iterative closest points (ICP) algorithm.
(68) The transformation calculation unit 140 may acquire the rotation transformation and the translation transformation for all the three-dimensional points of the newly acquired data according to Equation 1. That is, the rotation transformation and the translation transformation for minimizing the error are acquired by minimizing the inner product value between the eigenvector weighted with the eigenvalue of the voxel to which the three-dimensional corresponding point belongs and a vector generated from the three-dimensional corresponding point.
(69) At this time, for example, if the differences between the rotation transformation and translation transformation updated by applying the Levenberg-Marquardt algorithm and the rotation transformation and translation transformation calculated immediately before are less than a predetermined threshold, the error is determined to be minimum, and thus the algorithm may end.
(70) The map updating unit 150 updates the three-dimensional map data based on the rotation transformation and the translation transformation calculated by the transformation calculation unit 140. At this time, if the three-dimensional point transformed based on the rotation transformation and the translation transformation falls in a new region, a new eigenvalue and eigenvector are acquired, and if the three-dimensional point belongs to a voxel in the same space as the three-dimensional corresponding point, the eigenvalue and eigenvector of the voxel may be updated.
(71) Although the above description has been made with reference to preferred embodiments of the present invention, it will be understood that those skilled in the art may variously modify and change the present invention without departing from the spirit and scope of the present invention described in the claims below.