METHOD FOR GENERATING REAL-TIME RELATIVE MAP, INTELLIGENT DRIVING DEVICE, AND COMPUTER STORAGE MEDIUM
20230070760 · 2023-03-09
CPC classification
G01C21/3848
PHYSICS
G06V20/588
PHYSICS
Abstract
Disclosed are a method for generating a real-time relative map, an intelligent driving device, and a storage medium. The method includes the following steps: determining a number of collected lane guiding lines based on GPS trajectory data and included in a lane guiding line list; if the lane guiding line list includes a plurality of collected lane guiding lines, sorting the plurality of collected lane guiding lines according to coordinate values, which are on a coordinate axis perpendicular to a traveling direction of a vehicle, of points having a same serial number on each of the collected guiding lines, and generating a real-time relative map based on at least the sorted collected lane guiding lines and a lane width.
Claims
1. A method for generating a real-time relative map, comprising: determining a number of collected lane guiding lines based on GPS trajectory data and comprised in a lane guiding line list; if the lane guiding line list comprises a plurality of collected lane guiding lines, sorting the plurality of collected lane guiding lines based on coordinate values, which are on a coordinate axis perpendicular to a traveling direction of a vehicle, of points having a same serial number on each of the collected guiding lines, and generating the real-time relative map based on at least the sorted collected lane guiding lines and a lane width; if there is only one collected lane guiding line in the lane guiding line list, developing one or more derivative lane guiding lines based on the one collected lane guiding line and the lane width, and generating the real-time relative map based on at least one or more derivative lane guiding lines; and if there is no collected lane guiding line in the lane guiding line list, generating the real-time relative map based on lane guiding lines, which are generated according to lane lines obtained by a vehicle sensor.
2. The method of claim 1, further comprising: determining a projection point of a vehicle on each lane guiding line, and generating the real-time relative map according to one lane guiding line, which takes the projection point as a starting point and extends along a traveling direction of the vehicle.
3. The method of claim 1, further comprising: determining whether adjacent lane lines overlap; if the adjacent lane lines do not overlap, determining a shared lane line according to one of ways of: taking a central line between the adjacent lane lines as the shared lane line; taking one of the adjacent lane lines proximate to a vehicle as the shared lane line; and taking one of the adjacent lane lines away from the vehicle as the shared lane line.
4. The method of claim 1, further comprising: based on conditions of GPS signals and a confidence level of lane lines obtained by a vehicle sensor, generating a merged lane guiding line according to the collected lane guiding lines, or derivative lane guiding lines, or lane guiding lines generated according to lane lines obtained by a vehicle sensor, and generating the real-time relative map based on the merged lane guiding line.
5. The method of claim 1, further comprising: determining whether there is a gap between front and rear collected or derivative lane guiding lines on a travel path of a vehicle; if there is a gap, determining whether there is a point pair, points of which are located on a front collected or derived lane guiding line and a rear collected or derived lane guiding line respectively, and have a distance therebetween less than a preset threshold; and if there are such point pairs, selecting one point pair, points of which have a minimum distance from each other, and when the vehicle reaches one point of the one point pair located on the front lane guiding line, generating the real-time relative map in subsequent applications based on the rear lane guiding line.
6. The method of claim 4, further comprising: adding restrictions corresponding to traffic rules to the real-time relative map.
7. The method of claim 1, wherein before the determining the number of collected lane guiding lines based on the GPS trajectory data and comprised in the lane guiding line list, the method further comprises: recording and collecting lane guiding line data via GPS; establishing a lane guiding line list on the basis of a tuple of each lane guiding line; and storing data of one or more recorded lane guiding lines based on GPS in the lane guiding line list.
8. The method of claim 7, wherein the storing data of one or more recorded lane guiding lines based on GPS in the lane guiding line list comprises: storing the lane guiding line data in a format of a Front-Left-Up (FLU) coordinate system with respect to a vehicle body, wherein the FLU coordinate system takes a center of a rear axle of the vehicle as an origin of coordinates, takes a traveling direction of the vehicle as an X axis, and takes a direction perpendicular to the traveling direction of the vehicle as a Y axis; each lane guiding line is composed of a series of points, and each point is described by a tuple(x, y, s, θ, κ, κ′), wherein (x, y) denote the coordinates of a point on the lane guiding line in the FLU coordinate system; s represents a length of a travel path from a starting point (0, 0) in the FLU coordinate system to the point (x, y) on the lane guiding line; θ represents an angle between an orientation of the point (x, y) on the lane guiding line and the X axis of the starting point (0, 0) in the FLU coordinate system or a heading of the point (x, y) on the lane guiding line; κ and κ′ represent curvature and a first-order derivative of the point (x, y) on the lane guiding line, respectively.
9. The method of claim 8, wherein before the storing the lane guiding line data in a format of a Front-Left-Up (FLU) coordinate system with respect to a vehicle body, the method further comprises: presenting the collected data in an East-North-Up (ENU) coordinate system; and converting the collected data from the ENU coordinates into the FLU coordinates.
10. The method of claim 9, wherein the converting the collected data from the ENU coordinates into the FLU coordinates is executed by formulas of:
x_flu = (x_enu − x_ini)·cos θ_ini + (y_enu − y_ini)·sin θ_ini

y_flu = (y_enu − y_ini)·cos θ_ini − (x_enu − x_ini)·sin θ_ini

θ_flu = θ_enu − θ_ini, wherein position coordinates of the vehicle in the ENU coordinate system are (x_ini, y_ini); the position of the vehicle has a heading θ_ini; coordinates of any point on the guiding line in the ENU coordinate system are (x_enu, y_enu); θ_enu denotes a heading of the point in the ENU coordinate system; (x_flu, y_flu) denote position coordinates of the point in the FLU coordinate system; θ_flu denotes the heading of the point in the FLU coordinate system; and 0 ≤ θ_flu ≤ 2π.
11. A computer storage medium on which a computer program is stored, wherein the method of claim 1 is implemented when the computer program is executed by a processor.
12. An intelligent driving device, comprising: a processor; a storage and a network interface coupled with the processor, respectively; a GPS unit configured to obtain location information of the intelligent driving device; and a vehicle sensor unit configured to collect lane data of the intelligent driving device; wherein the processor is configured to perform the method of claim 1.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0021] The embodiments of the present application will be further described in detail with reference to the accompanying drawings hereinafter.
DETAILED DESCRIPTION
[0029] In order to make the objectives, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be described clearly and completely in combination with the drawings in the embodiments of the present application. Obviously, the described embodiments are only a part of the embodiments of the present application, but not all of them. Based on the embodiments of this application, all other embodiments obtained by those of ordinary skill in the art without creative work shall fall within the protection scope of this application.

[0030] In the following detailed description, reference may be made to the drawings of the specification, which are part of this application, to illustrate specific embodiments of this application. In the accompanying drawings, similar reference numerals in different drawings describe substantially similar components. Each specific embodiment of the present application is described sufficiently and in detail herein, so that a person of ordinary skill having relevant knowledge and technology in the art can implement the technical solutions of the present application. It should be understood that structural, logical or electrical changes may be made on the basis of other embodiments or made for the embodiments of the present application.

[0031] In order to make the objectives, technical solutions, and advantages of the present application clearer, the present application will be further described in detail in combination with the accompanying drawings and the embodiments. It should be understood that the specific embodiments described herein are only used to illustrate the present application, but are not intended to limit the present application.
[0032] One of the bases for generating a real-time relative map is a lane guiding line.
[0033] There are many data sources for generating the lane guiding line. One source is GPS trajectory data that have been recorded and collected in advance, and GPS positioning is necessary during generation of the lane guiding line by using these data. Another source is lane data obtained in real time by the vehicle sensor.
[0034] First, a method of recording and collecting lane guiding line data via GPS is described.
[0035] The lane guiding line data are stored in a format of a Front-Left-Up (FLU) coordinate system with respect to a vehicle body. The FLU coordinate system takes the center of the rear axle of the vehicle as the origin of coordinates, takes the traveling direction of the vehicle as the X axis, and takes the direction perpendicular to the traveling direction of the vehicle as the Y axis. Each lane guiding line may be composed of a series of points, and each point can be described by a tuple (x, y, s, θ, κ, κ′), where (x, y) denote the coordinates of a point on the lane guiding line in the FLU coordinate system.
[0036] If the collected data are presented by the East-North-Up (ENU) coordinate system, then the collected data need to be converted from the ENU coordinates into the FLU coordinates.
[0037] The conversion from the ENU coordinates into the FLU coordinates may be executed by formulas (1) and (2):

x_flu = (x_enu − x_ini)·cos θ_ini + (y_enu − y_ini)·sin θ_ini

y_flu = (y_enu − y_ini)·cos θ_ini − (x_enu − x_ini)·sin θ_ini (1)

θ_flu = θ_enu − θ_ini (2)
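As an illustrative sketch (not part of the original disclosure; all function and variable names are chosen for illustration), the conversion of formulas (1) and (2), with the resulting heading wrapped into the range [0, 2π) required by claim 10, may be written in Python as:

```python
import math

def enu_to_flu(x_enu, y_enu, theta_enu, x_ini, y_ini, theta_ini):
    """Convert a guiding-line point from ENU to vehicle-body FLU coordinates
    per formulas (1) and (2); the heading is wrapped into [0, 2*pi)."""
    dx, dy = x_enu - x_ini, y_enu - y_ini
    # formula (1): rotate the translated point by the vehicle heading
    x_flu = dx * math.cos(theta_ini) + dy * math.sin(theta_ini)
    y_flu = dy * math.cos(theta_ini) - dx * math.sin(theta_ini)
    # formula (2): heading difference, normalized to one full turn
    theta_flu = (theta_enu - theta_ini) % (2 * math.pi)
    return x_flu, y_flu, theta_flu
```

For example, a point one meter east of a vehicle heading east maps to one meter ahead on the FLU X axis.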
[0038] The s in the tuple of a point on the lane guiding line represents the length of the travel path from the starting point (0, 0) in the FLU coordinate system, namely the position of the vehicle, to the current point (x, y) on the lane guiding line.

[0039] The θ in the tuple of the point on the lane guiding line represents the angle between the orientation of the current point (x, y) on the lane guiding line and the X axis at the starting point (0, 0) in the FLU coordinate system, and the angle is also known as a heading. The heading in the ENU coordinate system may be converted into the heading in the FLU coordinate system by means of formula (2).

[0040] The κ and κ′ in the tuple of the point on the lane guiding line represent the curvature and the first-order derivative of curvature at the current point (x, y) on the lane guiding line, respectively.
[0041] How to obtain the tuple values of each point on the lane guiding line will be described in detail as follows.
[0042] The heading θ of the current point on the lane guiding line may be computed from formula (3):

θ(l) = θ_0 + ∫_0^l κ(t)dt (3)
[0043] In addition, the coordinates (x, y) of the current point on the lane guiding line in the FLU coordinate system may be computed from formulas (4):
x = x_0 + ∫_0^l cos θ(t)dt

y = y_0 + ∫_0^l sin θ(t)dt (4)
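Formulas (3) and (4) can be evaluated numerically when the curvature is available as discrete samples along the curve. The following sketch (illustrative only; the function name and the simple forward-integration scheme are assumptions, not part of the disclosure) accumulates the heading and the coordinates step by step:

```python
import math

def integrate_guiding_line(theta0, x0, y0, kappa, dl):
    """Numerically integrate formulas (3) and (4): given curvature samples
    `kappa` spaced `dl` apart along the curve, accumulate the heading
    theta(l) and the FLU coordinates (x, y) of successive points."""
    theta, x, y = theta0, x0, y0
    points = [(x, y, theta)]
    for k in kappa:
        theta += k * dl            # formula (3): theta(l) = theta0 + integral of kappa
        x += math.cos(theta) * dl  # formula (4): x = x0 + integral of cos theta
        y += math.sin(theta) * dl  # formula (4): y = y0 + integral of sin theta
        points.append((x, y, theta))
    return points
```

With zero curvature and zero initial heading, the integration reproduces a straight guiding line along the X axis, as expected.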
[0044] Assuming that the X-axis coordinate of the initial point of the lane guiding line is consistent with the X-axis coordinate of the initial position (0, 0) of the vehicle, then x_0 = 0. Within a short curve length (such as l ≤ 1 m), the change rate of the heading of the guiding line is relatively small, thus cos θ(t) ≈ 1 and sin θ(t) ≈ θ(t), and the formulas (4) can be simplified into formulas (5).
[0045] Here, y_0 denotes an offset of the initial point of the guiding line from the initial position (0, 0) of the vehicle along the Y axis. If the vehicle travels along the guiding line, then y_0 = 0 and l = s; otherwise, the offset y_0 can be obtained by means of sensor data. θ_0 denotes the angle between the X axis at the initial position of the vehicle and the heading of the initial point of the guiding line, and may be obtained by means of sensor data, or directly configured to be θ_0 = 0.
[0046] The curvature κ and its first-order derivative κ′ at different positions on the guiding line may be computed from formulas (6), where ε = 0.0001.

[0047] Here, κ_0 and κ_0′ denote the curvature and the first-order derivative of curvature at the initial point on the guiding line, respectively, and generally have a value of 0.

[0048] According to the formulas (5), the calculation formula of the heading θ of the current point on the guiding line may also be expressed as formula (7).
[0049] In addition, the travel path s from the vehicle to the current point (x, y) on the guiding line may be calculated by an iterative formula. If the longitudinal travel distance s_{n−1} corresponding to the (n−1)-th point is known, and the coordinates of the (n−1)-th point and the n-th point are (x_{n−1}, y_{n−1}) and (x_n, y_n) respectively, then the travel path s_n corresponding to the n-th point may be obtained by formula (8):

s_n = s_{n−1} + √((x_n − x_{n−1})² + (y_n − y_{n−1})²) (8)
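The iterative accumulation of s may be sketched as follows, assuming (as the text suggests) that each increment is the Euclidean distance between consecutive points; the function name is illustrative:

```python
import math

def accumulate_travel_path(points):
    """Iteratively compute s_n for each (x, y) point on a guiding line:
    s_0 = 0, and each s_n adds the Euclidean distance between the
    (n-1)-th and n-th points, in the manner of formula (8)."""
    s = [0.0]
    for (x_prev, y_prev), (x_cur, y_cur) in zip(points, points[1:]):
        s.append(s[-1] + math.hypot(x_cur - x_prev, y_cur - y_prev))
    return s
```
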
[0050] According to an embodiment, after the tuple corresponding to each point on the lane guiding line is obtained by means of the above calculation, a corresponding tuple may be defined for each lane guiding line, as shown in formula (9)
NaviPathTuple = (LineIndex, LeftWidth, RightWidth, PathDataPtr) (9)
[0051] Where, LineIndex may denote an index value (LineIndex=0, 1, 2 . . . ) of the lane guiding line in all lane guiding lines. LeftWidth and RightWidth may respectively denote the lateral distances between the lane guiding line and the left and right boundaries of the lane where the lane guiding line is located. PathDataPtr may denote a data pointer of the tuple (x, y, s, θ, κ, κ′) corresponding to each point on the lane guiding line.
[0052] According to an embodiment, a lane guiding line list may be established on the basis of the tuple of each lane guiding line, and data of one or more recorded lane guiding lines based on GPS may be stored in the lane guiding line list.
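A lane guiding line list built on the tuple of formula (9) might be represented as in the following illustrative Python rendering (not part of the disclosure), in which the data pointer PathDataPtr is replaced by an in-memory list of per-point tuples:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# one (x, y, s, theta, kappa, kappa') tuple per point on a guiding line
PointTuple = Tuple[float, float, float, float, float, float]

@dataclass
class NaviPathTuple:
    """One entry of the lane guiding line list, mirroring formula (9);
    path_data plays the role of PathDataPtr."""
    line_index: int     # LineIndex: 0, 1, 2, ...
    left_width: float   # lateral distance to the left lane boundary
    right_width: float  # lateral distance to the right lane boundary
    path_data: List[PointTuple] = field(default_factory=list)

# the lane guiding line list is then simply a list of such entries
lane_guiding_line_list: List[NaviPathTuple] = []
```
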
[0053]
[0054] In step 401, it may be determined whether the number of collected lane guiding lines included in the lane guiding line list is 0 (zero). The lane guiding line list includes the collected lane guiding lines presented by pre-recorded GPS trajectory data.
[0055] When the number of the lane guiding lines in the lane guiding line list is 0, jump to step 412, which is equivalent to the case where the GPS signals are unstable, and the real-time relative map is generated by using the vehicle sensor; that is, the real-time relative map is generated on the basis of the lane conditions collected by the vehicle sensor.
[0056] When the number of the collected lane guiding lines in the lane guiding line list is not 0, then in step 402, it may be determined whether the number of the collected lane guiding lines in the lane guiding line list is greater than one.
[0057] If the number of the collected lane guiding lines in the lane guiding line list is greater than one, then in step 403, the y-axis coordinates in the FLU coordinate system of the points, for example, the starting points, with a same serial number on each of the collected lane guiding lines, are sorted.
[0058] According to an embodiment, the FLU coordinates of the initial position of the vehicle are (0, 0), the y-axis coordinate of the guiding line on the left of the vehicle traveling direction is positive, and the farther the lane guiding line is from the vehicle, the greater the absolute value of the y-axis coordinate thereof. The y-axis coordinate of the guiding line on the right of the vehicle traveling direction is negative, and the farther the lane guiding line is from the vehicle, the greater the absolute value of the y-axis coordinate thereof. Therefore, the above-mentioned collected lane guiding lines can be sorted from left to right relative to the vehicle traveling direction in an order from positive to negative y-axis coordinates of the corresponding points.
[0059] In some cases, during recording, the sequence numbers of the lane guiding lines may be stored in the tuple of each lane guiding line in the lane guiding line list, but in actual applications or during automatic driving, data sending and receiving may be performed without following the sequence numbers of the lane guiding lines. Therefore, in a process of generating the real-time relative map or during the automatic driving, sorting the guiding lines based on the steps of the above method can avoid a disorder of the lane sequence caused by unsorted pre-recorded GPS data or by errors occurring during data sending.
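The sorting of step 403 may be sketched as follows, assuming each guiding line is a sequence of (x, y) points in FLU coordinates; sorting by descending y coordinate of the point with a given serial number orders the lines from left to right relative to the traveling direction (the function name and data layout are illustrative):

```python
def sort_guiding_lines(lines, serial=0):
    """Sort collected guiding lines left-to-right relative to the vehicle:
    descending y of the point with the given serial number, since positive
    y lies to the left of the travel direction and negative y to the right."""
    return sorted(lines, key=lambda line: line[serial][1], reverse=True)
```
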
[0060] If the number of the collected lane guiding lines in the lane guiding line list is one, jump to step 404.
[0061] In step 404, if there is only one collected lane guiding line in the lane guiding line list, one or more derivative lane guiding lines may be developed according to the one collected lane guiding line. The derivative lane guiding line refers to the lane guiding line obtained by calculation instead of by collecting and recording data.
[0062] For road sections in more complicated conditions, the GPS trajectory of each lane needs to be recorded to generate the real-time relative map based on the dynamic conditions of a plurality of recorded GPS trajectories, which is not only time-consuming but also difficult to implement. For example, long-term occupation of an overtaking lane is not allowed by road traffic rules, so it may be more difficult to record complete GPS trajectory data of the overtaking lane. In addition, for highways and other roads with regular shapes, the width, curvature and other parameters of each lane are basically identical, so GPS trajectory data of only one lane may be recorded, and then the real-time relative map with a plurality of lanes, which are developed according to the only one lane, can be generated, to reduce cost and difficulty of generating multiple guiding lines.
[0063] Specifically, the collected lane guiding line in the lane guiding line list may be the guiding line of a center lane. FLU coordinates of a point on the guiding line may be expressed as (x_center, y_center), and the heading of the point may be expressed as θ_center. A width of the left lane of the collected lane guiding line is W_left, and a width of the right lane thereof is W_right. The coordinates of points on the lane guiding lines of the derived lanes, which are developed to the left and to the right by one lane width, may be denoted by (x_left, y_left) and (x_right, y_right), respectively.
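A possible rendering of the development of derivative guiding lines follows. Formulas (10) and (11) are not reproduced in this text, so the perpendicular-offset form used here, shifting each point at a right angle to its heading by one lane width, is an assumption, and all names are illustrative:

```python
import math

def develop_derivative_lines(center_points, w_left, w_right):
    """Offset each (x, y, theta) point of the collected center guiding line
    perpendicularly to its heading by one lane width to each side, yielding
    the left and right derivative guiding lines."""
    left, right = [], []
    for x_c, y_c, th in center_points:
        # unit normal to the heading points to the left: (-sin th, cos th)
        left.append((x_c - w_left * math.sin(th), y_c + w_left * math.cos(th), th))
        right.append((x_c + w_right * math.sin(th), y_c - w_right * math.cos(th), th))
    return left, right
```

For a straight center line heading along the X axis, the derived lines simply lie one lane width above and below it on the Y axis.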
[0064] Optionally, in step 405, a projection point of the vehicle on each lane guiding line may be determined, and the real-time relative map is generated according to one lane guiding line, which takes the projection point as a starting point and extends along the traveling direction of the vehicle.
[0065]
ProjIndexPair=(ProjIndex,ProjDis) (12)
[0066] As shown in the figure, ProjIndex in the formula (12) denotes an index number of the projection point of the current position of the vehicle on the guiding line (0≤ProjIndex≤N−1, assuming that there are N points on the guiding line, and N is a positive integer greater than 1), and ProjDis denotes a projection distance from the current position of the vehicle to the guiding line.
[0067] Based on the above steps, it is unnecessary to scan from the starting point of the lane guiding line during generation of the real-time relative map, thus improving communication efficiency and reducing computation complexity, thereby increasing the speed of generating the real-time relative map.
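The determination of the ProjIndexPair of formula (12) may be approximated by a nearest-point search over the sampled guiding line, as in the following illustrative sketch (the true perpendicular projection may fall between sampled points; names are assumptions):

```python
import math

def project_vehicle(position, guiding_line):
    """Return (ProjIndex, ProjDis) in the sense of formula (12): the index
    of the guiding-line point nearest to the vehicle position, and the
    distance from the vehicle to that point."""
    px, py = position
    proj_index, proj_dis = min(
        ((i, math.hypot(x - px, y - py)) for i, (x, y) in enumerate(guiding_line)),
        key=lambda pair: pair[1])
    return proj_index, proj_dis
```

Subsequent map generation can then start from `proj_index` instead of rescanning the guiding line from its first point.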
[0068] Optionally, in step 406, it may be determined whether there is a gap between front and rear collected lane guiding lines on a travel path of the vehicle. If there is no gap, jump directly to step 412.
[0069] If there is a gap, in step 407, it can be determined whether there is a point pair, points of which are located on the front collected or derived lane guiding line and the rear collected or derived lane guiding line respectively, and have a distance therebetween less than a preset threshold.
[0070] If there is no such point pair, then in step 408, the front and the rear lane guiding lines are transitionally connected by means of calculation.
[0071] If there is at least one such point pair, then in step 409, one point pair, points of which have the minimum distance from each other, is selected. When the vehicle reaches the point of the selected point pair located on the front lane guiding line, the real-time relative map is generated in subsequent applications based on the rear lane guiding line.
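Steps 407 and 409 may be sketched as follows. The brute-force pair search is illustrative only; a practical implementation would likely restrict the search to the tail of the front line and the head of the rear line:

```python
import math

def find_switch_points(front_line, rear_line, threshold):
    """Among point pairs with one point on the front guiding line and one on
    the rear, both closer than `threshold`, pick the pair with the minimum
    distance. Returns (front_index, rear_index, distance), or None when no
    pair is closer than the threshold (the lines must then be transitionally
    connected by calculation, as in step 408)."""
    best = None
    for i, (xf, yf) in enumerate(front_line):
        for j, (xr, yr) in enumerate(rear_line):
            d = math.hypot(xr - xf, yr - yf)
            if d < threshold and (best is None or d < best[2]):
                best = (i, j, d)
    return best
```
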
[0072] In a demonstration application of the autonomous driving, a complete recorded GPS trajectory may be used to generate the guiding line. But for actual driving applications, especially for long-distance autonomous driving, it is obviously impossible to generate the real-time relative map by using only one guiding line, and a plurality of guiding lines are necessary for providing a complete path guidance. Therefore, a smooth connection between the front and rear guiding lines has become an urgent problem to be solved.
[0073] In the process of recording or collecting the GPS trajectory data, the data may be recorded by segment according to a specified time interval (for example, 3 minutes, which can be modified through a configuration file) or a specified distance (for example, 10 kilometers, which can be modified through the configuration file). According to an embodiment, each segment of such long-distance trajectory data may include several small data intervals, and the data of each interval may correspond to lane sensor data within the distance the vehicle travels in, for example, 8 seconds, or within 250 meters. In order to obtain the real-time guiding line, the length of each frame of the guiding line (for example, at a generation frequency of 10 Hz, that is, 0.1 second per frame) cannot be greater than a specified length; according to the analysis above, a configured length of 250 meters may meet most autopilot requirements, so a lane guiding line with a length of 250 meters is continuously and cyclically generated in real time every 0.1 second. Due to actual situations during collecting and recording data, there may be data gaps between the front and the rear guiding lines. In this case, as shown in
[0074] During the application of the autonomous driving, before reaching the point S1, the vehicle generates the real-time relative map by means of the first guiding line. When reaching the point S1 on the first guiding line, the vehicle switches to the point S2 on the second guiding line and generates the real-time relative map based on the second guiding line.
[0075] Optionally, in step 410, it may be determined whether adjacent lane lines overlap.
[0076] When collecting the GPS trajectory data, the collecting vehicle may deviate from the center line of the lane, which may cause the lane lines of adjacent lanes that should overlap with each other to be separated from each other.
[0077] If the adjacent lane lines do not overlap, then in step 411, one of the following ways is adopted to determine a shared lane line to generate the real-time relative map:
[0078] Taking a central line between the adjacent lane lines as the shared lane line;
[0079] Taking one of the adjacent lane lines that is proximate to the vehicle as the shared lane line; and
[0080] Taking one of the adjacent lane lines that is away from the vehicle as the shared lane line.
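The three ways of determining a shared lane line may be sketched as follows, assuming both lane lines are sampled as equal-length sequences of (x, y) points; the strategy names and the point-by-point averaging for the central line are illustrative assumptions:

```python
def shared_lane_line(line_a, line_b, strategy="center"):
    """Resolve two non-overlapping adjacent lane lines into one shared line.
    line_a is taken as the line nearer the vehicle, line_b as the farther one:
    'center' averages the two lines point by point, while 'near' and 'far'
    adopt one of the two lines directly."""
    if strategy == "near":
        return line_a
    if strategy == "far":
        return line_b
    return [((xa + xb) / 2.0, (ya + yb) / 2.0)
            for (xa, ya), (xb, yb) in zip(line_a, line_b)]
```
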
[0081] In step 412, a merged lane guiding line P_merge and the real-time relative map are generated by combining conditions of the GPS signals and a confidence level of the lane lines obtained by the vehicle sensor.

[0082] As shown in formula (13), P_gps denotes the guiding line generated according to GPS data, and P_sensor denotes the guiding line generated by means of the vehicle sensor. If the confidence level Conf(P_sensor) of the data obtained by the vehicle sensor is greater than or equal to a given threshold T_1 (for example, T_1 ≥ 0.9, which can be modified through the configuration file), then P_gps and P_sensor are weighted to generate the merged lane guiding line, for example, with a weighting coefficient W (for example, W ≥ 0.8, which can be modified through the configuration file).

[0083] If the confidence level Conf(P_sensor) of the data acquired by the vehicle sensor is greater than or equal to a given threshold T_2 (for example, T_2 ≥ 0.6, which can be modified through the configuration file) and the GPS signals are unstable, then the lane guiding line may be directly generated by means of P_sensor.

[0084] If the GPS signals are stable and the confidence level of the data acquired by the vehicle sensor satisfies Conf(P_sensor) < T_2 (for example, T_2 ≥ 0.6), then the lane guiding line may be generated directly by means of P_gps.

[0085] In situations other than those listed above, an error state can be considered to occur, and the vehicle may be controlled to stop immediately or to exit from the automatic driving mode. The above arrangement can not only solve the problem that the GPS is disturbed or that there is no signal, but also avoid the instability caused by relying too much on the vehicle sensor.
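The decision cascade of step 412 may be sketched as follows. Since formula (13) is not reproduced in this text, the weighted combination W·P_gps + (1 − W)·P_sensor is an assumed reading, and the requirement of stable GPS in the first branch is likewise an assumption; the thresholds and weight mirror the configurable examples above:

```python
def merge_guiding_line(p_gps, p_sensor, conf_sensor, gps_stable,
                       t1=0.9, t2=0.6, w=0.8):
    """Decision cascade for the merged guiding line P_merge. Guiding lines
    are equal-length sequences of (x, y) points. Returns None in the error
    state, where the vehicle should stop or exit automatic driving."""
    if conf_sensor >= t1 and gps_stable:
        # high sensor confidence: weighted merge of the two guiding lines
        return [(w * xg + (1 - w) * xs, w * yg + (1 - w) * ys)
                for (xg, yg), (xs, ys) in zip(p_gps, p_sensor)]
    if conf_sensor >= t2 and not gps_stable:
        return list(p_sensor)   # unstable GPS: rely on the vehicle sensor
    if gps_stable and conf_sensor < t2:
        return list(p_gps)      # stable GPS, low sensor confidence
    return None                 # error state
```
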
[0086] Optionally, in step 413, restrictions corresponding to traffic rules, such as speed limit, prohibition of overtaking, etc., may also be added to the real-time relative map.
[0087]
[0088] An embodiment of the present application also provides a computer storage medium, for example, including a storage storing a computer program, and the computer program may be executed by a processor to perform steps of the method for generating the real-time relative map provided by any embodiment of the present application. The computer storage medium can be FRAM, ROM, PROM, EPROM, EEPROM, Flash Memory, magnetic surface memory, optical disk, or CD-ROM, etc., and it may also be any device including one or any combination of the foregoing memories.
[0089] The technical features of the above embodiments may be combined arbitrarily. In order to make the description concise, not all possible combinations of the technical features in the above embodiments are described. However, as long as there is no contradiction in the combination of these technical features, the combinations should be within the range described in the specification.
[0090] The above-mentioned examples are only several embodiments of the present application, and the description thereof is relatively specific and detailed, but should not be understood to be limitations on the scope of the patent application. It should be noted that, for those of ordinary skill in the art, several modifications and improvements can be made without departing from the concept of this application, and all these modifications and improvements fall within the protection scope of this application. Therefore, the protection scope of the present invention shall be subject to the appended claims.