METHOD FOR LOCALIZING AND/OR MAPPING DURING OPERATION OF A VEHICLE IN AN ENVIRONMENT

20230266471 · 2023-08-24

Abstract

A method for localizing and/or mapping during operation of a vehicle in an environment is provided. The vehicle comprises at least one environment perception sensor for the localizing and/or mapping. The method includes obtaining at least one measurement of the environment from the environment perception sensor, searching for and identifying a plurality of objects in the at least one measurement which correspond to a predefined pattern, in response to determining that the plurality of objects correspond to the predefined pattern, filtering the at least one measurement so that only the plurality of objects which correspond to the predefined pattern are used for the localizing and/or mapping, and localizing and/or mapping based on the filtered at least one measurement.

Claims

1. A method for localizing and/or mapping during operation of a vehicle in an environment, wherein the vehicle comprises at least one environment perception sensor for the localizing and/or mapping, the method comprising: obtaining at least one measurement of the environment from the environment perception sensor, searching for and identifying a plurality of objects in the at least one measurement which correspond to a predefined pattern, in response to determining that the plurality of objects correspond to the predefined pattern, filtering the at least one measurement so that only the plurality of objects which correspond to the predefined pattern are used for the localizing and/or mapping, and localizing and/or mapping based on the filtered at least one measurement.

2. The method according to claim 1, wherein the predefined pattern is defined by a plurality of static objects.

3. The method according to claim 1, wherein the predefined pattern is defined by a plurality of separate objects which are offset from each other.

4. The method according to claim 3, wherein the separate objects are offset from each other by a minimum distance and/or by a maximum distance.

5. The method according to claim 1, wherein the predefined pattern is defined by a predefined shape of each one of the plurality of objects.

6. The method according to claim 5, wherein the predefined shape is defined by at least one of the following: a height of an object from a ground surface, a width of an object, a length of an object, a radius or radii profile of an object.

7. The method according to claim 5, wherein the predefined shape is at least partly any one of a cylinder shape, such as a pole, a cone shape, a spherical shape, a circular or oval shape, a triangle shape, a rectangle shape and a square shape.

8. The method according to claim 1, wherein identifying the plurality of objects in the at least one measurement comprises identifying a machine-readable visual identifier on at least one of the plurality of objects which fulfils a predefined identification criterion.

9. The method according to claim 8, wherein the predefined identification criterion is at least one of a predefined one-dimensional pattern, such as a predefined barcode, and/or a predefined two-dimensional pattern, such as a predefined QR code.

10. The method according to claim 1, wherein the predefined pattern is defined by at least three objects.

11. The method according to claim 10, wherein the at least three objects form a triangle pattern or a substantially linear pattern.

12. The method according to claim 1, wherein the predefined pattern is defined in that at least one object of the plurality of objects is surrounded by a free space which fulfils a predefined free space criterion.

13. A control unit for localizing and/or mapping during operation of a vehicle in an environment, the control unit being configured to perform the steps of the method according to claim 1.

14. A vehicle comprising at least one environment perception sensor for localizing and/or mapping, wherein the vehicle further comprises the control unit according to claim

15. A computer program comprising program code for performing the method of claim 1 when said program is run on a computer.

16. A computer readable medium carrying a computer program comprising program code for performing the method of claim 1 when said program is run on a computer.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0038] With reference to the appended drawings, below follows a more detailed description of embodiments of the invention cited as examples.

[0039] In the drawings:

[0040] FIG. 1 is a side view of a vehicle according to an example embodiment of the present invention,

[0041] FIG. 2 is a schematic and perspective view of a road section where the vehicle as shown in FIG. 1 is expected to travel,

[0042] FIG. 3 is a flowchart of a method according to an example embodiment of the invention,

[0043] FIG. 4 is a schematic view of an object forming part of a predefined pattern according to an example embodiment of the present invention,

[0044] FIGS. 5a and 5b are schematic views of predefined patterns according to example embodiments of the present invention.

DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS OF THE INVENTION

[0045] FIG. 1 depicts a vehicle 100 in the form of a truck. In the shown embodiment, the truck 100 is a towing truck for towing a trailer (not shown). It shall be understood that the invention is not restricted to only this type of vehicle, but may advantageously be used for other vehicles, including but not limited to other trucks, buses, construction equipment, such as wheel loaders, dump trucks, excavators etc. The vehicle 100 is preferably an autonomous vehicle using SLAM for localizing and/or mapping during operation. The vehicle 100 may also be equipped with other means for localizing the vehicle 100, such as GNSS. However, GNSS may not always function properly. Thereby, by using localizing and/or mapping according to the method as disclosed herein, a more reliable and robust method is achieved. This also implies redundancy, i.e. more than one procedure may be used for the localizing and/or mapping.

[0046] In the shown embodiment, the vehicle 100 comprises an environment perception sensor 110. The shown sensor 110 is here a LIDAR. However, as mentioned in the above, also other types of sensors may be used. The vehicle 100 typically comprises more than one environment perception sensor for obtaining a measurement which results in a representation of the environment.

[0047] In the shown embodiment, the vehicle 100 further comprises a control unit 120. The control unit is configured to perform the steps of a method according to the first aspect of the invention. An embodiment of the method will be described below with reference to the flowchart shown in FIG. 3.

[0048] The control unit 120 is an electronic control unit and may comprise processing circuitry which is adapted to run a computer program as disclosed herein. The control unit 120 may comprise hardware and/or software for performing the method according to the invention. In an embodiment the control unit 120 may be denoted a computer. The control unit 120 may be constituted by one or more separate sub-control units. In addition, the control unit 120 may communicate with the sensor 110 by use of wired and/or wireless communication means. The control unit 120 is herein part of the vehicle 100 as shown in FIG. 1. Alternatively, the control unit 120 may be part of another vehicle entity. Still further, even though the control unit 120 preferably is a vehicle on-board control unit 120, it shall be noted that the control unit may additionally or alternatively be a vehicle off-board control unit, such as a control unit being part of a computer cloud system.

[0049] With respect to especially FIGS. 1, 2 and 3, a method according to an example embodiment of the invention will be described. FIG. 2 illustrates a perspective view of a road section RS along which the vehicle 100 is intended to be guided by use of SLAM. At a side of the road section RS, a plurality of objects 10, 20, 30 can be seen.

[0050] The method comprises:

[0051] S1: obtaining at least one measurement of the environment from the environment perception sensor 110. The measurement performed by the LIDAR sensor 110 is at least one scan of the environment.

[0053] The method further comprises:

[0054] S2: searching for and identifying a plurality of objects 10, 20, 30 in the at least one measurement which correspond to a predefined pattern, and

[0055] in response to determining that the plurality of objects correspond to the predefined pattern, S3: filtering the at least one measurement so that only the plurality of objects which correspond to the predefined pattern are used for the localizing and/or mapping.

[0057] The method further comprises:

[0058] S4: localizing and/or mapping based on the filtered at least one measurement.
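By way of illustration only, steps S1-S4 could be sketched in simplified form as follows. The representation of a scan as a list of 2-D object positions, the distance range used as the predefined pattern, and the centroid-based placeholder standing in for a SLAM back-end are all assumptions for illustration, not taken from the present disclosure:

```python
import math

def identify_pattern(scan, d_min=0.5, d_max=2.0):
    """S2: return the objects matching the predefined pattern, here
    simplified to consecutive offsets within [d_min, d_max]."""
    ordered = sorted(scan)
    ok = all(
        d_min <= math.hypot(q[0] - p[0], q[1] - p[1]) <= d_max
        for p, q in zip(ordered, ordered[1:])
    )
    return ordered if ok else []

def filter_scan(scan, pattern_objects):
    """S3: keep only the objects that belong to the predefined pattern."""
    return [obj for obj in scan if obj in pattern_objects]

def localize(filtered_scan):
    """S4 placeholder: a real system would feed the filtered scan to a
    SLAM/localization algorithm; here we return the pattern centroid."""
    if not filtered_scan:
        return None
    n = len(filtered_scan)
    xs = sum(p[0] for p in filtered_scan) / n
    ys = sum(p[1] for p in filtered_scan) / n
    return (xs, ys)
```

The sketch keeps the order of operations of the flowchart in FIG. 3: the scan is first matched against the pattern, then filtered, and only the filtered measurement is used for localizing.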

[0059] In FIG. 2, which is a schematic and simplified illustration of the environment, only the three objects 10, 20, 30 forming the predefined pattern are shown. However, it shall be understood that the at least one measurement will include a large number of different objects, elements etc. (not shown). Accordingly, by use of the method, other objects, elements etc. which are not part of the predefined pattern are filtered out. Thereby, a filtered measurement, or scan, may be provided, as represented in FIG. 2. As shown, the predefined pattern is here defined by three objects, 10, 20, 30 which are offset from each other by respective distances d1, d2.

[0060] By use of the method, a more reliable and robust localizing and/or mapping of the environment is achieved, implying e.g. improved position/pose accuracy. Consequently, as a result, the accuracy during guiding of the vehicle 100 along the road section RS can be improved.

[0061] According to an example embodiment of the invention, the method may further comprise guiding the vehicle 100 from one point to another point by use of the localizing and/or mapping as disclosed herein.

[0062] As shown in e.g. FIG. 2, the predefined pattern may be defined by a plurality of static objects 10, 20, 30. Accordingly, the method comprises searching for and identifying a plurality of static objects 10, 20, 30 in the at least one measurement, i.e. objects which are not moving. Thereby, moving objects, such as other road users, e.g. vehicles, pedestrians etc., can be filtered out.
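One simple, purely illustrative way of restricting a measurement to static objects is to compare two consecutive scans and keep only objects whose position has not changed between them. The matching tolerance and the nearest-neighbour comparison below are assumptions, not prescribed by the disclosure:

```python
import math

def static_objects(scan_t0, scan_t1, tol=0.1):
    """Keep an object from the current scan only if some object in the
    previous scan lies within `tol` metres of it, i.e. it has not moved
    between scans. Tolerance value is an illustrative assumption."""
    kept = []
    for obj in scan_t1:
        if any(math.hypot(obj[0] - p[0], obj[1] - p[1]) <= tol
               for p in scan_t0):
            kept.append(obj)
    return kept
```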

[0064] As further shown in e.g. FIG. 2, the predefined pattern may be defined by a plurality of separate objects 10, 20, 30 which are offset from each other. Accordingly, the plurality of objects 10, 20, 30 may be separated from each other by respective distances as already mentioned in the above. In the embodiment shown in FIG. 2, the objects 10 and 20 are separated by a distance d1 and the objects 20 and 30 are separated by a distance d2. The separate objects 10, 20, 30 may be offset from each other by a minimum distance and/or by a maximum distance. Accordingly, by way of example, each one of the distances d1 and d2 may fall within a predefined range which is defined by the minimum distance and the maximum distance. For example, the minimum distance may be 0.5 m and the maximum distance may be 2 m, such as a predefined range of 0.5-1 m.
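Searching a cluttered scan for object triples whose consecutive offsets d1, d2 fall within the predefined range could, purely as a sketch, be done by exhaustive enumeration of candidate triples. The range values and the brute-force search are illustrative assumptions:

```python
from itertools import combinations
import math

def dist(a, b):
    """Euclidean distance between two 2-D object positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def find_pattern_triples(candidates, d_min=0.5, d_max=2.0):
    """Return all triples of candidate objects whose consecutive
    offsets d1, d2 both fall within [d_min, d_max], mirroring the
    distances d1, d2 of FIG. 2. Illustrative only; a real system
    would prune the search spatially."""
    triples = []
    for a, b, c in combinations(candidates, 3):
        d1, d2 = dist(a, b), dist(b, c)
        if d_min <= d1 <= d_max and d_min <= d2 <= d_max:
            triples.append((a, b, c))
    return triples
```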

[0066] FIG. 4 depicts one of the objects 10 which are part of the predefined pattern as shown in FIG. 2. FIG. 4 further shows that the predefined pattern may additionally or alternatively be identified by specific characteristics of each object.

[0068] For example, the predefined pattern may be defined by a predefined shape of each one of the plurality of objects 10, 20, 30. FIG. 4 depicts that the object 10 has a specific height h from a ground surface, a width w and a radius r. Additionally, or alternatively, the predefined shape may be defined by a length of the object.

[0070] In the shown embodiment, the predefined shape is a cylinder shape. More specifically, the object 10 as shown in FIG. 4 is a pole. However, also other shapes are conceivable, such as a cone shape, a spherical shape, a circular or oval shape, a triangle shape, a rectangle shape and a square shape. For example, it may be advantageous to use a shape which does not resemble other elements in nature, e.g. trees, rocks, stones etc.
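A shape criterion as in claim 6 could, by way of illustration, be reduced to checking measured dimensions against reference pole dimensions within a tolerance. The reference values `h_ref`, `r_ref` and the tolerance are illustrative assumptions, not values from the disclosure:

```python
def matches_cylinder(height, radius, *, h_ref=1.0, r_ref=0.05, tol=0.2):
    """Accept an object whose measured height (from the ground surface)
    and radius are within a relative tolerance of reference cylinder
    dimensions. All reference values are illustrative assumptions."""
    return (abs(height - h_ref) <= tol * h_ref
            and abs(radius - r_ref) <= tol * r_ref)
```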

[0072] Additionally, or alternatively, at least one of the objects 10 may comprise a machine-readable visual identifier 12. Accordingly, identifying the plurality of objects 10, 20, 30 in the at least one measurement may comprise identifying the machine-readable visual identifier 12 on at least one of the plurality of objects which fulfils a predefined identification criterion. In the shown embodiment, the predefined identification criterion is a predefined one-dimensional pattern, in this case a predefined barcode. Also other machine-readable visual identifiers are conceivable, such as a QR code (not shown) with a predefined two-dimensional pattern.

[0074] FIGS. 5a and 5b show schematic views from above of a plurality of objects 10, 20, 30 which correspond to a predefined pattern according to the invention. In both illustrations, the predefined pattern is defined by three separate objects which are offset from each other.

[0076] With reference to FIG. 5a, the three objects 10, 20, 30 may form a substantially linear pattern. For example, the substantially linear pattern may be defined by a corridor, or aisle, C which has a width corresponding to a predefined distance d3. Accordingly, the plurality of objects 10, 20, 30 are considered to be part of the predefined pattern if they are provided in the corridor C. By allowing the corridor C to have a relatively large width d3, such as 1-2 m, a user may place the objects 10, 20, 30 with a relatively low accuracy along an imaginary line. Furthermore, by also allowing the distances d1 and d2 to have a span which fulfils a predefined range as mentioned in the above, the user may only need to make a very rough estimation when placing the objects in the environment.
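The corridor criterion of FIG. 5a could be sketched as follows: the objects lie within the corridor C of width d3 if each object's perpendicular distance to the line through the first and last object is at most half the corridor width. Taking the line through the end objects as the corridor axis is an illustrative assumption:

```python
import math

def within_corridor(points, width):
    """Return True if every object lies within a corridor of the given
    width centred on the line through the first and last object, i.e.
    the objects form a 'substantially linear' pattern (FIG. 5a)."""
    (x1, y1), (x2, y2) = points[0], points[-1]
    length = math.hypot(x2 - x1, y2 - y1)
    for (px, py) in points:
        # Perpendicular distance from the point to the corridor axis
        d = abs((x2 - x1) * (y1 - py) - (x1 - px) * (y2 - y1)) / length
        if d > width / 2:
            return False
    return True
```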

[0078] As another example, with reference to FIG. 5b, the three objects 10, 20, 30 may form a triangle pattern. Relative distances d4, d5, d6 between the objects 10, 20, 30 may also fall within a predefined range as mentioned in the above.
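The triangle pattern of FIG. 5b could likewise be reduced, as a sketch, to checking that all three pairwise distances d4, d5, d6 fall within the predefined range. The range values are illustrative assumptions:

```python
import math

def is_triangle_pattern(a, b, c, d_min=0.5, d_max=2.0):
    """Three objects form the triangle pattern (FIG. 5b) if all three
    pairwise distances d4, d5, d6 fall within [d_min, d_max].
    Range values are illustrative, not from the disclosure."""
    dists = [math.hypot(p[0] - q[0], p[1] - q[1])
             for p, q in ((a, b), (b, c), (a, c))]
    return all(d_min <= d <= d_max for d in dists)
```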

[0080] The predefined pattern may additionally or alternatively be defined in that at least one object of the plurality of objects 10, 20, 30 is surrounded by a free space which fulfils a predefined free space criterion. For example, a user may place the objects so that they are surrounded by a free space which fulfils the predefined free space criterion. Thereby it may be easier for the at least one environment perception sensor 110 to identify the objects 10, 20, 30 when searching for the objects in the at least one measurement.
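A free space criterion as in claim 12 could, purely by way of illustration, be a minimum clearance: the object fulfils the criterion if no other scan point lies within a given radius of it. The clearance value and the point-wise check are assumptions, not specified in the disclosure:

```python
import math

def fulfils_free_space(obj, scan_points, clearance=1.0):
    """Return True if no other scan point lies within `clearance`
    metres of the object, i.e. the object is surrounded by free space.
    The clearance value is an illustrative assumption."""
    return all(
        math.hypot(obj[0] - p[0], obj[1] - p[1]) >= clearance
        for p in scan_points
        if p != obj
    )
```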

[0082] It shall be understood that any one or any combination of the above-mentioned characteristics may be used for identifying the predefined pattern.

[0083] It is to be understood that the present invention is not limited to the embodiments described above and illustrated in the drawings; rather, the skilled person will recognize that many changes and modifications may be made within the scope of the appended claims.