METHOD FOR LOCALIZING AND/OR MAPPING DURING OPERATION OF A VEHICLE IN AN ENVIRONMENT
20230266471 · 2023-08-24
Assignee
Inventors
CPC classification
E01F9/30
FIXED CONSTRUCTIONS
G06V10/255
PHYSICS
G06V20/56
PHYSICS
International classification
Abstract
A method for localizing and/or mapping during operation of a vehicle in an environment is provided. The vehicle comprises at least one environment perception sensor for the localizing and/or mapping. The method includes obtaining at least one measurement of the environment from the environment perception sensor, searching for and identifying a plurality of objects in the at least one measurement which correspond to a predefined pattern, in response to determining that the plurality of objects correspond to the predefined pattern, filtering the at least one measurement so that only the plurality of objects which correspond to the predefined pattern are used for the localizing and/or mapping, and localizing and/or mapping based on the filtered at least one measurement.
Claims
1. A method for localizing and/or mapping during operation of a vehicle in an environment, wherein the vehicle comprises at least one environment perception sensor for the localizing and/or mapping, the method comprising: obtaining at least one measurement of the environment from the environment perception sensor, searching for and identifying a plurality of objects in the at least one measurement which correspond to a predefined pattern, in response to determining that the plurality of objects correspond to the predefined pattern, filtering the at least one measurement so that only the plurality of objects which correspond to the predefined pattern are used for the localizing and/or mapping, and localizing and/or mapping based on the filtered at least one measurement.
2. The method according to claim 1, wherein the predefined pattern is defined by a plurality of static objects.
3. The method according to claim 1, wherein the predefined pattern is defined by a plurality of separate objects which are offset from each other.
4. The method according to claim 3, wherein the separate objects are offset from each other by a minimum distance and/or by a maximum distance.
5. The method according to claim 1, wherein the predefined pattern is defined by a predefined shape of each one of the plurality of objects.
6. The method according to claim 5, wherein the predefined shape is defined by at least one of the following: a height of an object from a ground surface, a width of an object, a length of an object, a radius or radii profile of an object.
7. The method according to claim 5, wherein the predefined shape is at least partly any one of a cylinder shape, such as a pole, a cone shape, a spherical shape, a circular or oval shape, a triangle shape, a rectangle shape and a square shape.
8. The method according to claim 1, wherein identifying the plurality of objects in the at least one measurement comprises identifying a machine-readable visual identifier on at least one of the plurality of objects which fulfils a predefined identification criterion.
9. The method according to claim 8, wherein the predefined identification criterion is at least one of a predefined one-dimensional pattern, such as a predefined barcode, and/or a predefined two-dimensional pattern, such as a predefined QR code.
10. The method according to claim 1, wherein the predefined pattern is defined by at least three objects.
11. The method according to claim 10, wherein the at least three objects form a triangle pattern or a substantially linear pattern.
12. The method according to claim 1, wherein the predefined pattern is defined in that at least one object of the plurality of objects is surrounded by a free space which fulfils a predefined free space criterion.
13. A control unit for localizing and/or mapping during operation of a vehicle in an environment, the control unit being configured to perform the steps of the method according to claim 1.
14. A vehicle comprising at least one environment perception sensor for localizing and/or mapping, wherein the vehicle further comprises the control unit according to claim
15. A computer program comprising program code for performing the method of claim 1 when said program is run on a computer.
16. A computer readable medium carrying a computer program comprising program code for performing the method of claim 1 when said program is run on a computer.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0038] With reference to the appended drawings, below follows a more detailed description of embodiments of the invention cited as examples.
[0039] In the drawings:
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS OF THE INVENTION
[0046] In the shown embodiment, the vehicle 100 comprises an environment perception sensor 110. The shown sensor 110 is here a LIDAR. However, as mentioned above, other types of sensors may also be used. The vehicle 100 typically comprises more than one environment perception sensor for obtaining a measurement which results in a representation of the environment.
[0047] In the shown embodiment, the vehicle 100 further comprises a control unit 120. The control unit is configured to perform the steps of a method according to the first aspect of the invention. An embodiment of the method will be described below with reference to the flowchart shown in
[0048] The control unit 120 is an electronic control unit and may comprise processing circuitry which is adapted to run a computer program as disclosed herein. The control unit 120 may comprise hardware and/or software for performing the method according to the invention. In an embodiment the control unit 120 may be denoted a computer. The control unit 120 may be constituted by one or more separate sub-control units. In addition, the control unit 120 may communicate with the sensor 110 by use of wired and/or wireless communication means. The control unit 120 is herein part of the vehicle 100 as shown in
[0049] With particular reference to
[0050] The method comprises:
[0051] S1: obtaining at least one measurement of the environment from the environment perception sensor 110. The measurement performed by the LIDAR sensor 110 is at least one scan of the environment.
[0053] The method further comprises:
[0054] S2: searching for and identifying a plurality of objects 10, 20, 30 in the at least one measurement which correspond to a predefined pattern, and
[0055] in response to determining that the plurality of objects correspond to the predefined pattern, S3: filtering the at least one measurement so that only the plurality of objects which correspond to the predefined pattern are used for the localizing and/or mapping.
[0057] The method further comprises:
[0058] S4: localizing and/or mapping based on the filtered at least one measurement.
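The four steps S1 to S4 can be sketched in code. The sketch below assumes the LIDAR scan has already been segmented into candidate objects, each reduced to a 2D centroid, and it uses an offset-based pattern (minimum and maximum pairwise distance, as in claims 3 and 4); the function names, distance bounds, and the trivial centroid-based localization are illustrative assumptions, not taken from the patent.

```python
from itertools import combinations
from math import dist

def matches_pattern(objects, min_dist=1.0, max_dist=10.0):
    """S2: check that every pair of candidate objects is offset from
    each other by at least min_dist and at most max_dist."""
    return all(min_dist <= dist(a, b) <= max_dist
               for a, b in combinations(objects, 2))

def localize(scan_objects):
    """scan_objects is the result of S1 (one segmented LIDAR scan)."""
    # S2: search for objects corresponding to the predefined pattern.
    if not matches_pattern(scan_objects):
        return None
    # S3: filter the measurement so that only the matching objects
    # are used for the localizing and/or mapping.
    filtered = list(scan_objects)
    # S4: localize; here, trivially, as the centroid of the matched
    # objects (a stand-in for a real scan-matching step).
    n = len(filtered)
    return (sum(x for x, _ in filtered) / n,
            sum(y for _, y in filtered) / n)
```

A real implementation would replace the centroid in S4 with scan matching against a map of the pattern objects; the control flow, however, follows S1 to S4 as listed above.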
[0059] In
[0060] By use of the method, a more reliable and robust localizing and/or mapping of the environment is achieved, implying e.g. improved position/pose accuracy. Consequently, as a result, the accuracy during guiding of the vehicle 100 along the road section RS can be improved.
[0061] According to an example embodiment of the invention, the method may further comprise guiding the vehicle 100 from one point to another point by use of the localizing and/or mapping as disclosed herein.
[0062] As shown in e.g.
[0064] As further shown in e.g.
[0068] For example, the predefined pattern may be defined by a predefined shape of each one of the plurality of objects 10, 20, 30.
[0070] In the shown embodiment, the predefined shape is a cylinder shape. More specifically, the object 10 as shown in
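One way the cylinder-shape cue could be tested on a horizontal LIDAR slice is that the returns from a pole lie at roughly constant distance from a common center. The sketch below uses a crude centroid-based check with an illustrative radius tolerance; a production system would use proper circle fitting, and none of the names or values here come from the patent.

```python
from math import hypot

def looks_like_pole(points, expected_radius, tol=0.05):
    """Do the 2D points lie near a circle of expected_radius?
    Centroid-based fit test: estimate the center as the mean of the
    points, then require every point's radius to be within tol of
    the expected radius."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    return all(abs(hypot(x - cx, y - cy) - expected_radius) <= tol
               for x, y in points)
```

The centroid coincides with the circle center only when the returns cover the full circumference evenly, which is why a least-squares circle fit would be preferred in practice.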
[0072] Additionally, or alternatively, at least one of the objects 10 may comprise a machine-readable visual identifier 12. Accordingly, identifying the plurality of objects 10, 20, 30 in the at least one measurement may comprise identifying the machine-readable visual identifier 12 on at least one of the plurality of objects which fulfils a predefined identification criterion. In the shown embodiment, the predefined identification criterion is a predefined one-dimensional pattern, in this case a predefined barcode. Other machine-readable visual identifiers are also conceivable, such as a QR code (not shown) with a predefined two-dimensional pattern.
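The identification criterion can be sketched as a lookup step: once a barcode or QR payload has been decoded from the sensor data (decoding itself is outside this sketch), it is accepted only if it matches a predefined register of landmark codes, each tied to a surveyed map position. The code strings and coordinates below are invented for illustration.

```python
# Hypothetical register: decoded payload -> surveyed map position (x, y).
REGISTERED_LANDMARKS = {
    "LM-0001": (12.0, 3.5),
    "LM-0002": (15.0, 3.5),
    "LM-0003": (18.0, 3.5),
}

def identify(decoded_payload):
    """Return the surveyed position if the payload fulfils the
    predefined identification criterion (i.e. is a registered
    landmark), otherwise None."""
    return REGISTERED_LANDMARKS.get(decoded_payload)
```

An identifier that resolves to a known position turns the object into an unambiguous anchor for the localizing and/or mapping, which is the practical benefit of the criterion.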
[0076] With reference to
[0078] As another example, with reference to
[0080] The predefined pattern may additionally or alternatively be defined in that at least one object of the plurality of objects 10, 20, 30 is surrounded by a free space which fulfils a predefined free space criterion. For example, a user may place the objects so that they are surrounded by a free space which fulfils the predefined free space criterion. Thereby it may be easier for the at least one environment perception sensor 110 to identify the objects 10, 20, 30 when searching for the objects in the at least one measurement.
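A simple reading of the free-space criterion is that a candidate object passes if no other scan return lies within a clearance radius of it, so the object stands isolated in the measurement. The clearance value is an illustrative assumption.

```python
from math import dist

def has_free_space(candidate, other_returns, clearance=2.0):
    """Predefined free space criterion (sketch): the candidate
    object's 2D position must be farther than `clearance` from
    every other return in the scan."""
    return all(dist(candidate, p) > clearance for p in other_returns)
```

Objects placed with such clearance are easier to segment out of the scan, which matches the stated motivation for the criterion.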
[0082] It shall be understood that any one or any combination of the above-mentioned characteristics may be used for identifying the predefined pattern.
[0083] It is to be understood that the present invention is not limited to the embodiments described above and illustrated in the drawings; rather, the skilled person will recognize that many changes and modifications may be made within the scope of the appended claims.