METHOD FOR GENERATING A MAP OF AN AREA TO BE USED FOR GUIDING A VEHICLE

20220333949 · 2022-10-20

Abstract

A method for generating a map of an area to be used for guiding a vehicle in the area by use of at least one vehicle guiding sensor, the method comprising obtaining sensor accuracy data comprising information about vehicle guiding sensor accuracy problems which are associated with at least one vehicle position of a vehicle when the vehicle is guided in the area; based on the obtained sensor accuracy data, defining one or more low sensor accuracy zones relating to the vehicle position(s) at which the vehicle guiding sensor accuracy problems were recognized; and generating the map based on the defined one or more low sensor accuracy zones.

Claims

1. A method for generating a map of an area to be used for guiding a vehicle in the area by use of at least one vehicle guiding sensor, the method comprising: obtaining sensor accuracy data comprising information about vehicle guiding sensor accuracy problems which are associated with at least one vehicle position of a vehicle when the vehicle is guided in the area; based on the obtained sensor accuracy data, defining one or more low sensor accuracy zones relating to the vehicle position(s) at which the vehicle guiding sensor accuracy problems were recognized; and generating the map based on the defined one or more low sensor accuracy zones.

2. The method according to claim 1, wherein the information about vehicle guiding sensor accuracy problems is further associated with at least one of a vehicle state, an environmental condition and a time of day when the vehicle is guided in the area.

3. The method according to claim 1, wherein the sensor accuracy data is obtained from on-board vehicle guiding sensors of a plurality of vehicles which are guided in the area.

4. The method according to claim 1, wherein the sensor accuracy data is obtained from at least one off-board vehicle guiding sensor.

5. The method according to claim 1, wherein at least one of the one or more low sensor accuracy zones is defined by determining at least one entry point for the low sensor accuracy zone, which at least one entry point defines a limit between the low sensor accuracy zone and a high sensor accuracy zone outside of the low sensor accuracy zone.

6. The method according to claim 1, wherein the map is further generated by defining at least one high sensor accuracy zone outside of the one or more low sensor accuracy zones.

7. The method according to claim 1, wherein the sensor accuracy data relates to at least one of GNSS (Global Navigation Satellite System) data accuracy problems, articulation angle measurement problems and environmental perception sensor problems.

8. The method according to claim 1, wherein the vehicle guiding sensor accuracy problems are defined by at least one predetermined threshold value.

9. A control unit for generating a map of an area to be used for guiding a vehicle in the area by use of at least one vehicle guiding sensor, wherein the control unit is configured to perform the steps of the method according to claim 1.

10. A method for guiding a vehicle in an area, comprising: guiding the vehicle in the area by use of a map generated according to claim 1, and issuing a warning signal when the vehicle is entering or approaching at least one of the one or more low sensor accuracy zones.

11. A control unit for guiding a vehicle in an area, wherein the control unit is configured to perform the steps of the method according to claim 10.

12. A vehicle, wherein the vehicle is configured to be guided by use of the control unit according to claim 11 and by at least one vehicle guiding sensor.

13. The vehicle according to claim 12, wherein the vehicle is an articulated vehicle combination comprising at least one articulation joint.

14. A computer program comprising instructions for causing a control unit to perform the steps of the method according to claim 1.

15. A computer readable medium comprising instructions for causing a control unit to perform the steps of the method according to claim 1.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0033] With reference to the appended drawings, below follows a more detailed description of embodiments of the invention cited as examples.

[0034] In the drawings:

[0035] FIG. 1 is a side view of a vehicle according to an embodiment of the invention;

[0036] FIG. 2 is a flowchart of a method according to an embodiment of the first aspect of the invention;

[0037] FIG. 3 is a schematic view of an area comprising low sensor accuracy zones; and

[0038] FIG. 4 is a flowchart of a method according to an embodiment of the third aspect of the invention.

DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS OF THE INVENTION

[0039] FIG. 1 depicts a side view of a vehicle 1 in the form of a heavy-duty truck and trailer combination. Accordingly, the vehicle 1 is here an articulated vehicle combination, and comprises a truck 11 and a trailer 12, which here is a so-called semi-trailer. The articulated vehicle combination 1 comprises an articulation joint A1. The vehicle 1 may use a map as will be described below, and/or the vehicle 1 may be used for generating such a map. The articulated vehicle combination 1 may further comprise at least one on-board vehicle guiding sensor (not shown) for guiding the vehicle combination 1. Additionally, or alternatively, the vehicle combination 1 may be guided by one or more off-board vehicle guiding sensors (not shown).

[0040] FIG. 3 shows a schematic view from above of an area A in which a vehicle 1 is to be guided by use of one or more vehicle guiding sensors 2, 3. The vehicle 1 as shown in FIG. 3 is also here an articulated vehicle combination and comprises a truck 11′ and a trailer 12′. This vehicle 1 comprises a first and a second articulation joint A1′, A2, and may be denoted a Nordic combination. The vehicle, in this case the truck 11′, comprises a control unit 10. The vehicle guiding sensor 2 is an on-board camera which is mounted at a rearward facing surface of the trailer 12′. A vehicle guiding sensor, such as the camera 2, may additionally or alternatively be mounted on the truck 11′. The vehicle guiding sensors 3 are off-board cameras which are directed so that they can observe at least parts of the area A. The vehicle 1 may comprise further vehicle guiding sensors, such as the above-mentioned GNSS sensor and articulation angle measurement sensors (not shown).

[0041] With reference to especially FIG. 2 and FIG. 3, a method according to example embodiments of the first aspect of the invention will be described.

[0042] The method according to the first aspect is a method for generating a map of an area A to be used for guiding a vehicle 1 in the area A by use of at least one vehicle guiding sensor 2, 3. The method comprises:

[0043] S1: obtaining sensor accuracy data comprising information about vehicle guiding sensor accuracy problems which are associated with at least one vehicle position of a vehicle 1 when the vehicle 1 is guided in the area A. For example, when the vehicle 1 is guided in the area A along a travelling path T by use of e.g. the vehicle guiding sensor 2, the vehicle 1 may experience a sensor accuracy problem for the vehicle guiding sensor 2 when entering a zone Z.sub.2. The sensor accuracy problem may for example be that the camera 2 is obstructed by sun glare in the zone Z.sub.2. A vehicle position may for example be defined by coordinates in a coordinate system.

[0044] The method further comprises:

[0045] S2: based on the obtained sensor accuracy data, defining one or more low sensor accuracy zones Z.sub.1, Z.sub.2, Z.sub.3 relating to the vehicle position(s) at which the vehicle guiding sensor accuracy problems were recognized; and

[0046] S3: generating the map based on the defined one or more low sensor accuracy zones Z.sub.1, Z.sub.2, Z.sub.3.

[0047] Accordingly, when e.g. the vehicle 1, or any other vehicle (not shown), is guided in the area A, it may experience sensor accuracy problems of one or more vehicle guiding sensors, which problems are associated with a position of the vehicle in the area A. This data is then used for defining one or more low sensor accuracy zones Z.sub.1, Z.sub.2, Z.sub.3 which relate to the vehicle positions at which the vehicle guiding sensor accuracy problems were recognized. For example, in addition to the above problem related to zone Z.sub.2, the zones Z.sub.1 and Z.sub.3 may relate to GNSS sensor accuracy problems, e.g. in these zones the position information fluctuates too much, or no position information is obtained at all. The fluctuations may reach or exceed a predetermined threshold value, above which threshold value the fluctuations will negatively affect the vehicle guiding.
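Expressed as a sketch, steps S1 to S3 could be implemented as below. The tuple-based report format, the grid-cell approximation of the zones, and all names are illustrative assumptions, not part of the claimed method:

```python
from collections import defaultdict

def generate_map(accuracy_reports, cell_size=10.0, min_reports=1):
    """S1-S3 as a sketch: group reported sensor accuracy problems (S1)
    by vehicle position into grid cells that define low sensor
    accuracy zones (S2), and return those zones as the map data (S3)."""
    cells = defaultdict(list)
    for x, y, sensor_id in accuracy_reports:
        # Quantize the vehicle position into a fixed-size square cell
        # (an assumed approximation of a zone's extent).
        cell = (int(x // cell_size), int(y // cell_size))
        cells[cell].append(sensor_id)
    # Every cell with enough problem reports becomes a low sensor
    # accuracy zone in the generated map.
    return {cell: sensors for cell, sensors in cells.items()
            if len(sensors) >= min_reports}

# Two GNSS problem reports near one position, one camera problem elsewhere:
zone_map = generate_map([(12.0, 3.0, "gnss"), (14.5, 7.0, "gnss"),
                         (55.0, 40.0, "camera2")])
```

Raising min_reports trades sensitivity for robustness, mirroring the use of data from a plurality of vehicles described below.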

[0048] The information about vehicle guiding sensor accuracy problems for any one of the zones Z.sub.1, Z.sub.2, Z.sub.3 may further be associated with at least one of a vehicle state, an environmental condition and a time of day when the vehicle is guided in the area A. For example, the above-mentioned sun glare in zone Z.sub.2 may be experienced at a specific time of day and/or during specific weather conditions, e.g. during daytime on a sunny day. In addition, the sun glare for e.g. the camera 2 may only be related to a specific orientation of the vehicle 1, i.e. at a specific vehicle state. Thereby, by also associating this information with the vehicle positions, a further improved map may be generated. In particular, when guiding a vehicle 1 by use of the generated map along e.g. the travelling path T, it can be determined in advance if, when and where a sensor accuracy problem may occur for e.g. the vehicle guiding sensor 2. Accordingly, this information can be used for recognizing sensor accuracy problems for any vehicle guiding sensor. Thereby, the generated map may be used for guiding a vehicle 1 by similar and/or by the same type of vehicle guiding sensors which are associated with the generated map. According to an example embodiment, the vehicle 1 which is guided by use of the map comprises at least one vehicle guiding sensor which is of the same type, or of a similar type, and which also is mounted at the same, or at a similar, position, as the vehicle guiding sensor(s) which experienced the sensor accuracy problem when the map was generated.

[0049] The sensor accuracy data may be obtained from on-board vehicle guiding sensors 2 of for example a plurality of vehicles which are guided in the area A, and/or the sensor accuracy data may be obtained from at least one off-board vehicle guiding sensor 3 when one or more vehicles are guided in the area A. As such, by recording sensor accuracy data—by the aforementioned vehicle guiding sensors 2, 3, and/or by any other vehicle guiding sensors—when several vehicles are guided in the area A, a more reliable map can be generated.

[0050] Furthermore, at least one of the one or more low sensor accuracy zones Z.sub.1, Z.sub.2, Z.sub.3 may be defined by determining at least one entry point Z.sub.1e, Z.sub.2e, Z.sub.3e for the low sensor accuracy zone, which at least one entry point Z.sub.1e, Z.sub.2e, Z.sub.3e defines a limit between the low sensor accuracy zone Z.sub.1, Z.sub.2, Z.sub.3 and a high sensor accuracy zone Z.sub.HA outside of the low sensor accuracy zone Z.sub.1, Z.sub.2, Z.sub.3. In the FIG. 3 embodiment the entry point of each zone is at least one point on the dashed line enclosing each one of the zones Z.sub.1, Z.sub.2, Z.sub.3. The dashed line for each zone Z.sub.1, Z.sub.2, Z.sub.3 is preferably defined by several entry points, whereby the border of each zone Z.sub.1, Z.sub.2, Z.sub.3 may e.g. be created/defined by interpolating between the determined entry points. It shall be understood that even though a visual representation of the area A and of its sensor accuracy zones Z.sub.1, Z.sub.2, Z.sub.3 is shown in FIG. 3, the generated map may instead be represented simply as data for a computer, such as for any one of the control units disclosed herein. By recording sensor accuracy data from more than one vehicle and/or during more than one occasion, the confidence of the actual position of the entry point(s) Z.sub.1e, Z.sub.2e, Z.sub.3e can be increased.
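The interpolation between determined entry points can be sketched as follows, under the assumption that the entry points are given in order around the zone; linear interpolation between consecutive points densifies the border in the manner of the dashed lines in FIG. 3:

```python
def zone_border(entry_points, samples_per_edge=4):
    """Build a closed zone border by linearly interpolating between
    consecutive determined entry points (assumed to be ordered
    around the zone)."""
    border = []
    n = len(entry_points)
    for i in range(n):
        (x0, y0) = entry_points[i]
        (x1, y1) = entry_points[(i + 1) % n]  # wrap around to close the border
        for s in range(samples_per_edge):
            t = s / samples_per_edge
            border.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return border

# Four determined entry points around a square low sensor accuracy zone:
border = zone_border([(0, 0), (10, 0), (10, 10), (0, 10)])
```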

[0051] The map may further be generated by defining at least one high sensor accuracy zone Z.sub.HA outside of the one or more low sensor accuracy zones Z.sub.1, Z.sub.2, Z.sub.3. This zone may for example be defined by subtracting the zones Z.sub.1, Z.sub.2, Z.sub.3 from the area A.
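The subtraction can be sketched directly, assuming both the area A and the low sensor accuracy zones are represented as sets of grid cells (an illustrative representation, not mandated by the text):

```python
def high_accuracy_zone(area_cells, low_accuracy_cells):
    """Define the high sensor accuracy zone Z_HA by subtracting the
    low sensor accuracy zones from the cells covering the area A."""
    return set(area_cells) - set(low_accuracy_cells)

# A 3x3-cell area with two low sensor accuracy cells:
area = {(x, y) for x in range(3) for y in range(3)}
z_ha = high_accuracy_zone(area, {(0, 0), (1, 1)})
```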

[0052] The area A may be any type of area, including but not limited to a terminal area, a harbour, a confined area, a street, a part of a city or the like.

[0053] The sensor accuracy data may as mentioned relate to at least one of GNSS (Global Navigation Satellite System) data accuracy problems, articulation angle measurement problems and environmental perception sensor problems. An environmental perception sensor may for example be a LIDAR, RADAR, SONAR, camera, and/or ultrasonic sensor.

[0054] The vehicle guiding sensor accuracy problems may be defined by at least one predetermined threshold value, as mentioned in the above. For example, a predetermined threshold value for a GNSS sensor, a LIDAR sensor or for any other vehicle guiding sensor may be determined, which value is indicative of a sensor accuracy problem.

[0055] When e.g. a camera or the like is used for determining an articulation angle of an articulated vehicle combination 1, the articulation angle may be determined by matching an image obtained by the camera with a stored image. The stored image may have been obtained during a calibration operation for the camera. If there is a difficulty in finding a match, this may be an indication of a sensor accuracy problem. For example, sun glare may result in the obtained image not matching any stored image from the calibration operation.
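A heavily simplified sketch of this matching idea follows. A real system would use proper image features; the pixel-equality score, the calibration table, and the 0.8 threshold are all illustrative assumptions:

```python
def match_score(image, stored):
    """Toy image match: fraction of equal pixels between the obtained
    image and a stored calibration image."""
    equal = sum(1 for a, b in zip(image, stored) if a == b)
    return equal / len(stored)

def articulation_angle(image, calibration, min_score=0.8):
    """Return the articulation angle whose stored calibration image
    best matches the obtained image, or None when no sufficient match
    is found, indicating a sensor accuracy problem (e.g. sun glare
    corrupting the obtained image)."""
    best_angle, best = None, 0.0
    for angle, stored in calibration.items():
        score = match_score(image, stored)
        if score > best:
            best_angle, best = angle, score
    return best_angle if best >= min_score else None

# Stored calibration images (as flat pixel lists) for two known angles:
calibration = {0: [1, 1, 0, 0], 15: [1, 0, 1, 0]}
angle = articulation_angle([1, 1, 0, 0], calibration)
```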

[0056] Other non-limiting examples of sensor accuracy problems which may be identified and used for defining the low sensor accuracy zones Z.sub.1, Z.sub.2, Z.sub.3 are:

[0057] GNSS triangulation errors which exceed a predetermined threshold value,

[0058] GNSS position and/or heading which fluctuates,

[0059] a number of available GNSS satellites which suddenly decreases, and

[0060] articulation angle value which fluctuates.
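The fluctuation-based indicators above can be sketched as a threshold check over a window of recent GNSS position fixes; the 0.5 m default threshold and the sample format are assumptions for illustration:

```python
from statistics import stdev

def gnss_problem(position_samples, threshold=0.5):
    """Flag a sensor accuracy problem when recent GNSS position fixes
    fluctuate by at least a predetermined threshold, or when no
    position information is obtained at all."""
    if len(position_samples) < 2:
        return True  # missing position information is itself a problem
    xs = [p[0] for p in position_samples]
    ys = [p[1] for p in position_samples]
    return stdev(xs) >= threshold or stdev(ys) >= threshold
```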

[0061] A control unit 10 of e.g. the vehicle 1 may be used for generating the map of the area A. It shall however be understood that the control unit may be any type of control unit, also including off-board control units or computers.

[0062] The map may be generated over time by one or more vehicles which are guided in the area A. Thereby the map may also be updated over time, taking e.g. new circumstances into consideration. For example, a GNSS problem may occur after a new building has been built in the area A, and by continually updating the map, such modifications to the area A may be considered and included in the updated map.
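Updating over time can be sketched as evidence decay plus new reports, so that a newly appearing problem (e.g. after a building is erected) eventually creates a zone while stale zones fade away; the decay factor and the evidence threshold are illustrative assumptions:

```python
def update_map(zone_counts, new_reports, decay=0.9, threshold=3.0):
    """Decay previously accumulated problem evidence per cell, add new
    problem reports, and return the updated evidence together with the
    cells that currently qualify as low sensor accuracy zones."""
    counts = {cell: weight * decay for cell, weight in zone_counts.items()}
    for cell in new_reports:
        counts[cell] = counts.get(cell, 0.0) + 1.0
    low_zones = {cell for cell, weight in counts.items() if weight >= threshold}
    return counts, low_zones

# Four problem reports in one cell establish a low sensor accuracy zone:
counts, zones = update_map({}, [(1, 0)] * 4)
```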

[0063] With reference to FIG. 3 and FIG. 4, a method for guiding a vehicle 1 in an area A according to the third aspect will be described. The method comprises:

[0064] S10: guiding the vehicle 1 in the area A by use of a map generated according to any one of the embodiments of the first aspect of the invention, and

[0065] S20: issuing a warning signal when the vehicle 1 is entering or approaching at least one of the one or more low sensor accuracy zones Z.sub.1, Z.sub.2, Z.sub.3.

[0066] For example, as shown in FIG. 3, the travelling path T which the vehicle 1 is guided along is at least partly provided in the low sensor accuracy zone Z.sub.2. Thereby, a driver of the truck 11′ may be notified by the warning signal that the vehicle 1 during the vehicle guiding will enter into the zone Z.sub.2. Thereby, appropriate actions may be taken by the driver, such as taking over and manually controlling the vehicle 1 in the zone Z.sub.2, and/or avoiding entering the zone Z.sub.2. In the shown embodiment the vehicle 1 is reversing along the travelling path T. Additionally, or alternatively, the warning signal may be used by e.g. the control unit 10, or any other control unit, to take an appropriate action. For example, it may be recognized that a similar sensor as the vehicle guiding sensor 2 has experienced problems in the zone Z.sub.2. Thereby, an appropriate action may be to instead use another sensor, such as the vehicle guiding sensors 3, for the vehicle guiding. Another appropriate action may be to replan the travelling path T to a travelling path T′ such that the zone Z.sub.2 is avoided, or at least partially avoided.
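Step S20 can be sketched with zones simplified to circles (a deliberate simplification of the zone borders shown in FIG. 3); the 5 m warning distance is an assumption:

```python
import math

def check_zone_warning(position, zones, warn_distance=5.0):
    """Return 'entering' when the vehicle position lies inside a low
    sensor accuracy zone, 'approaching' when it is within warn_distance
    of one, and None otherwise (step S20)."""
    px, py = position
    for (cx, cy), radius in zones:
        distance = math.hypot(px - cx, py - cy)
        if distance <= radius:
            return "entering"
        if distance <= radius + warn_distance:
            return "approaching"
    return None

# One circular low sensor accuracy zone centred at the origin:
zones = [((0.0, 0.0), 10.0)]
warning = check_zone_warning((3.0, 4.0), zones)
```

The returned signal could then trigger a driver notification, a switch to another sensor, or a replanning of the travelling path T to T′.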

[0067] The vehicle guiding sensors 3 are preferably connected to a control unit (not shown), or computer, which in turn is communicatively connected with e.g. the control unit 10 of the vehicle 1. In such an embodiment, the communication is preferably wireless, such as via WiFi, Bluetooth, 3G, 4G, 5G or the like.

[0068] The control unit 10 preferably performs the vehicle guiding method according to the third aspect, by e.g. issuing appropriate signals to steering actuators of the vehicle 1 such that any one of the travelling paths T or T′ is followed.

[0069] The above-mentioned methods may be implemented as one or more computer programs which are run by e.g. the control unit 10 and/or as a computer readable medium comprising instructions for causing e.g. the control unit 10 to perform any one of the methods.

[0070] The vehicles 1 in FIGS. 1 and 3 may be guided by use of advanced driver assistance systems (ADAS), and/or they may be autonomously driven by use of the vehicle guiding sensors 2 and/or 3.

[0071] The control units as disclosed herein may include a microprocessor, a microcontroller, a programmable digital signal processor or another programmable device. Thus, the control unit 10 may comprise electronic circuits and connections (not shown) as well as processing circuitry (not shown) such that the control unit 10 can communicate with different parts of the vehicle 1 or with different control units of the vehicle 1, such as with various sensors, systems and control units, in particular with one or more electronic control units (ECUs) controlling electrical systems or subsystems in the vehicle 1. The control unit 10 may comprise modules in either hardware or software, or partially in hardware or software, and communicate using known transmission buses such as a CAN-bus and/or wireless communication capabilities. The processing circuitry may be a general-purpose processor or a specific processor. The control unit 10 may comprise a non-transitory memory for storing computer program code and data. Thus, the skilled person realizes that the control unit 10 may be embodied by many different constructions.

[0072] It is to be understood that the present invention is not limited to the embodiments described above and illustrated in the drawings; rather, the skilled person will recognize that many changes and modifications may be made within the scope of the appended claims.