METHOD FOR GENERATING A MAP OF AN AREA TO BE USED FOR GUIDING A VEHICLE
20220333949 · 2022-10-20
Assignee
Inventors
CPC classification
B62D15/0285
PERFORMING OPERATIONS; TRANSPORTING
B62D15/027
PERFORMING OPERATIONS; TRANSPORTING
G05D1/0214
PHYSICS
International classification
Abstract
A method for generating a map of an area to be used for guiding a vehicle in the area by use of at least one vehicle guiding sensor, the method comprising obtaining sensor accuracy data comprising information about vehicle guiding sensor accuracy problems which are associated with at least one vehicle position of a vehicle when the vehicle is guided in the area; based on the obtained sensor accuracy data, defining one or more low sensor accuracy zones relating to the vehicle position(s) at which the vehicle guiding sensor accuracy problems were recognized; and generating the map based on the defined one or more low sensor accuracy zones.
Claims
1. A method for generating a map of an area to be used for guiding a vehicle in the area by use of at least one vehicle guiding sensor, the method comprising: obtaining sensor accuracy data comprising information about vehicle guiding sensor accuracy problems which are associated with at least one vehicle position of a vehicle when the vehicle is guided in the area; based on the obtained sensor accuracy data, defining one or more low sensor accuracy zones relating to the vehicle position(s) at which the vehicle guiding sensor accuracy problems were recognized; and generating the map based on the defined one or more low sensor accuracy zones.
2. The method according to claim 1, wherein the information about vehicle guiding sensor accuracy problems is further associated with at least one of a vehicle state, an environmental condition and a time of day when the vehicle is guided in the area.
3. The method according to claim 1, wherein the sensor accuracy data is obtained from on-board vehicle guiding sensors of a plurality of vehicles which are guided in the area.
4. The method according to claim 1, wherein the sensor accuracy data is obtained from at least one off-board vehicle guiding sensor.
5. The method according to claim 1, wherein at least one of the one or more low sensor accuracy zones is defined by determining at least one entry point for the low sensor accuracy zone, which at least one entry point defines a limit between the low sensor accuracy zone and a high sensor accuracy zone outside of the low sensor accuracy zone.
6. The method according to claim 1, wherein the map is further generated by defining at least one high sensor accuracy zone outside of the one or more low sensor accuracy zones.
7. The method according to claim 1, wherein the sensor accuracy data relates to at least one of GNSS (Global Navigation Satellite System) data accuracy problems, articulation angle measurement problems and environmental perception sensor problems.
8. The method according to claim 1, wherein the vehicle guiding sensor accuracy problems are defined by at least one predetermined threshold value.
9. A control unit for generating a map of an area to be used for guiding a vehicle in the area by use of at least one vehicle guiding sensor, wherein the control unit is configured to perform the steps of the method according to claim 1.
10. A method for guiding a vehicle in an area, comprising: guiding the vehicle in the area by use of a map generated according to claim 1, and issuing a warning signal when the vehicle is entering or approaching at least one of the one or more low sensor accuracy zones.
11. A control unit for guiding a vehicle in an area, wherein the control unit is configured to perform the steps of the method according to claim 10.
12. A vehicle, wherein the vehicle is configured to be guided by use of the control unit according to claim 11 and by at least one vehicle guiding sensor.
13. The vehicle according to claim 12, wherein the vehicle is an articulated vehicle combination comprising at least one articulation joint.
14. A computer program comprising instructions for causing a control unit to perform the steps of the method according to claim 1.
15. A computer readable medium comprising instructions for causing a control unit to perform the steps of the method according to claim 1.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0033] With reference to the appended drawings, below follows a more detailed description of embodiments of the invention cited as examples.
[0034] In the drawings:
[0035]
[0036]
[0037]
[0038]
DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS OF THE INVENTION
[0039]
[0040]
[0041] With reference to especially
[0042] The method according to the first aspect is a method for generating a map of an area A to be used for guiding a vehicle 1 in the area A by use of at least one vehicle guiding sensor 2, 3. The method comprises:
[0043] S1: obtaining sensor accuracy data comprising information about vehicle guiding sensor accuracy problems which are associated with at least one vehicle position of a vehicle 1 when the vehicle 1 is guided in the area A. For example, when the vehicle 1 is guided in the area A along a travelling path T by use of e.g. the vehicle guiding sensor 2, the vehicle 1 may experience a sensor accuracy problem for the vehicle guiding sensor 2 when entering a zone Z.sub.2. The sensor accuracy problem may for example be that the camera 2 is obstructed by sun glare in the zone Z.sub.2. A vehicle position may for example be defined by coordinates in a coordinate system.
[0044] The method further comprises:
[0045] S2: based on the obtained sensor accuracy data, defining one or more low sensor accuracy zones Z.sub.1, Z.sub.2, Z.sub.3 relating to the vehicle position(s) at which the vehicle guiding sensor accuracy problems were recognized; and
[0046] S3: generating the map based on the defined one or more low sensor accuracy zones Z.sub.1, Z.sub.2, Z.sub.3.
[0047] Accordingly, when e.g. the vehicle 1, or any other vehicle (not shown), is guided in the area A, it may experience sensor accuracy problems of one or more vehicle guiding sensors, which problems are associated with a position of the vehicle in the area A. This data is then used for defining one or more low sensor accuracy zones Z.sub.1, Z.sub.2, Z.sub.3 which relate to the vehicle positions at which the vehicle guiding sensor accuracy problems were recognized. For example, in addition to the above problem related to zone Z.sub.2, the zones Z.sub.1 and Z.sub.3 may relate to GNSS sensor accuracy problems, e.g. zones in which the position information fluctuates too much, or in which no position information is obtained at all. The fluctuations may reach or exceed a predetermined threshold value above which they negatively affect the vehicle guiding.
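Steps S1 to S3 may be sketched as follows. This is a minimal illustration, not part of the original disclosure: it assumes a hypothetical grid-based representation of the area A, in which positions where sensor accuracy problems were recognized are grouped into grid cells, and cells with sufficiently many reports become low sensor accuracy zones.

```python
from collections import defaultdict

CELL = 10.0  # assumed grid-cell size in metres (hypothetical value)

def define_low_accuracy_zones(problem_positions, min_reports=2):
    """Step S2: group vehicle positions at which sensor accuracy
    problems were recognized (step S1) into grid cells; cells with
    at least min_reports reports become low sensor accuracy zones."""
    counts = defaultdict(int)
    for x, y in problem_positions:
        counts[(int(x // CELL), int(y // CELL))] += 1
    return {cell for cell, n in counts.items() if n >= min_reports}

def generate_map(area_cells, problem_positions, min_reports=2):
    """Step S3: the generated map labels each cell of the area A
    as a low or high sensor accuracy zone."""
    low = define_low_accuracy_zones(problem_positions, min_reports)
    return {cell: ("low" if cell in low else "high") for cell in area_cells}
```

The cell size and the report count that defines a zone are tuning parameters; the disclosure only requires that zones relate to the positions at which problems were recognized.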
[0048] The information about vehicle guiding sensor accuracy problems for any one of the zones Z.sub.1, Z.sub.2, Z.sub.3 may further be associated with at least one of a vehicle state, an environmental condition and a time of day when the vehicle is guided in the area A. For example, the above-mentioned sun glare in zone Z.sub.2 may be experienced at a specific time of day and/or during specific weather conditions, e.g. during daytime on a sunny day. In addition, the sun glare for e.g. the camera 2 may only be related to a specific orientation of the vehicle 1, i.e. to a specific vehicle state. Thereby, by also associating this information with the vehicle positions, a further improved map may be generated. In particular, when guiding a vehicle 1 by use of the generated map along e.g. the travelling path T, it can be determined in advance if, when and where a sensor accuracy problem may occur for e.g. the vehicle guiding sensor 2. Accordingly, this information can be used for recognizing sensor accuracy problems for any vehicle guiding sensor. Thereby, the generated map may be used for guiding a vehicle 1 by similar and/or by the same type of vehicle guiding sensors which are associated with the generated map. According to an example embodiment, the vehicle 1 which is guided by use of the map comprises at least one vehicle guiding sensor which is of the same type, or of a similar type, and which also is mounted at the same, or at a similar, position, as the vehicle guiding sensor(s) which experienced the sensor accuracy problem when the map was generated.
[0049] The sensor accuracy data may be obtained from on-board vehicle guiding sensors 2 of for example a plurality of vehicles which are guided in the area A, and/or the sensor accuracy data may be obtained from at least one off-board vehicle guiding sensor 3 when one or more vehicles are guided in the area A. As such, by recording sensor accuracy data—by the aforementioned vehicle guiding sensors 2, 3, and/or by any other vehicle guiding sensors—when several vehicles are guided in the area A, a more reliable map can be generated.
[0050] Furthermore, at least one of the one or more low sensor accuracy zones Z.sub.1, Z.sub.2, Z.sub.3 may be defined by determining at least one entry point Z.sub.1e, Z.sub.2e, Z.sub.3e for the low sensor accuracy zone, which at least one entry point Z.sub.1e, Z.sub.2e, Z.sub.3e defines a limit between the low sensor accuracy zone Z.sub.1, Z.sub.2, Z.sub.3 and a high sensor accuracy zone Z.sub.HA outside of the low sensor accuracy zone Z.sub.1, Z.sub.2, Z.sub.3. In the
[0051] The map may further be generated by defining at least one high sensor accuracy zone Z.sub.HA outside of the one or more low sensor accuracy zones Z.sub.1, Z.sub.2, Z.sub.3. This zone may for example be defined by subtracting the zones Z.sub.1, Z.sub.2, Z.sub.3 from the area A.
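The subtraction mentioned above can be expressed compactly. In the following sketch (an illustration under the same hypothetical grid-cell representation as above, not part of the original disclosure), the high sensor accuracy zone Z.sub.HA is the set difference between the area A and the union of the low sensor accuracy zones:

```python
def high_accuracy_zone(area_cells, low_zones):
    """Define Z_HA by subtracting the low sensor accuracy zones
    Z_1, Z_2, Z_3 (each a set of cells) from the area A."""
    low = set().union(*low_zones) if low_zones else set()
    return set(area_cells) - low
```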
[0052] The area A may be any type of area, including but not limited to a terminal area, a harbour, a confined area, a street, a part of a city or the like.
[0053] The sensor accuracy data may as mentioned relate to at least one of GNSS (Global Navigation Satellite System) data accuracy problems, articulation angle measurement problems and environmental perception sensor problems. An environmental perception sensor may for example be a LIDAR, RADAR, SONAR, camera, and/or ultrasonic sensor.
[0054] The vehicle guiding sensor accuracy problems may be defined by at least one predetermined threshold value, as mentioned in the above. For example, a predetermined threshold value for a GNSS sensor, a LIDAR sensor or for any other vehicle guiding sensor may be determined, which value is indicative of a sensor accuracy problem.
[0055] When e.g. a camera or the like is used for determining an articulation angle of an articulated vehicle combination 1, the articulation angle may be determined by matching an image obtained by the camera with a stored image. The stored image may have been obtained during a calibration operation for the camera. If there is a difficulty in finding a match, this may be an indication of a sensor accuracy problem. For example, sun glare may result in the obtained image failing to match any stored image from the calibration operation.
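The image-matching principle can be illustrated with a deliberately simplified sketch. This is not the actual disclosed implementation: real systems would use proper image registration, but the idea that a poor best-match score signals a sensor accuracy problem can be shown with a sum-of-absolute-differences score over stored calibration images (images here are hypothetical flat lists of grayscale pixel values):

```python
def match_score(image, template):
    """Sum of absolute differences between two equally sized images."""
    return sum(abs(a - b) for a, b in zip(image, template))

def articulation_angle(image, calibration, threshold=50):
    """calibration maps articulation angle -> image stored during the
    calibration operation. Returns the angle of the best-matching
    stored image, or None when even the best match exceeds the
    threshold, which may indicate a sensor accuracy problem
    (e.g. the camera obstructed by sun glare)."""
    angle, score = min(((a, match_score(image, img))
                        for a, img in calibration.items()),
                       key=lambda t: t[1])
    return angle if score <= threshold else None
```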
[0056] Other non-limiting examples of sensor accuracy problems which may be identified and used for defining the low sensor accuracy zones Z.sub.1, Z.sub.2, Z.sub.3 are:
[0057] GNSS triangulation errors which exceed a predetermined threshold value,
[0058] GNSS position and/or heading which fluctuates,
a number of available GNSS satellites which suddenly decreases, and
[0060] articulation angle value which fluctuates.
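A fluctuation-type problem from the list above can be detected by comparing a dispersion measure of recent samples against the predetermined threshold value. The following is a hedged sketch, not the disclosed implementation; the threshold of 0.5 m and the per-coordinate standard deviation are assumed example choices:

```python
import statistics

def position_fluctuates(samples, threshold=0.5):
    """Flags a GNSS accuracy problem when recent position samples
    (x, y) fluctuate more than a predetermined threshold, measured
    here as the population standard deviation per coordinate."""
    xs = [p[0] for p in samples]
    ys = [p[1] for p in samples]
    return (statistics.pstdev(xs) > threshold
            or statistics.pstdev(ys) > threshold)
```

The same pattern applies to a fluctuating articulation angle value, with the angle samples in place of the coordinates.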
[0061] A control unit 10 of e.g. the vehicle 1 may be used for generating the map of the area A. It shall however be understood that the control unit may be any type of control unit, also including off-board control units or computers.
[0062] The map may be generated over time by one or more vehicles which are guided in the area A. Thereby the map may also be updated over time, taking e.g. new circumstances into consideration. For example, a GNSS problem may occur after a new building has been built in the area A, and by continually updating the map, such a modification to the area A can be considered and included in the updated map.
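Continual updating can be sketched as merging new observations into the existing zone map. This is an illustration only, using the hypothetical cell-labelled map from above; the notion of "cleared" cells, where a previously recognized problem no longer occurs, is an assumption and not explicitly part of the disclosure:

```python
def update_map(zone_map, new_problem_cells, cleared_cells=()):
    """Return an updated copy of the zone map: cells with newly
    recognized sensor accuracy problems become low accuracy zones,
    and cells where the problem no longer occurs revert to high."""
    updated = dict(zone_map)
    for cell in new_problem_cells:
        updated[cell] = "low"
    for cell in cleared_cells:
        updated[cell] = "high"
    return updated
```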
[0063] With reference to
[0064] S10: guiding the vehicle 1 in the area A by use of a map generated according to any one of the embodiments of the first aspect of the invention, and
[0065] S20: issuing a warning signal when the vehicle 1 is entering or approaching at least one of the one or more low sensor accuracy zones Z.sub.1, Z.sub.2, Z.sub.3.
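Step S20 may be sketched as a proximity check against the zone entry points Z.sub.1e, Z.sub.2e, Z.sub.3e described earlier. This is a simplified illustration, not the disclosed implementation; the warning distance of 20 m and the point-distance criterion are assumed examples:

```python
import math

def check_zone_proximity(position, entry_points, warn_distance=20.0):
    """Issue a warning signal (step S20) when the vehicle position
    is entering or approaching a low sensor accuracy zone, judged
    here by the distance to the zone's entry point."""
    for name, (ex, ey) in entry_points.items():
        d = math.hypot(position[0] - ex, position[1] - ey)
        if d <= warn_distance:
            return f"warning: approaching zone {name}"
    return None
```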
[0066] For example, as shown in
[0067] The vehicle guiding sensors 3 are preferably connected to a control unit (not shown), or computer, which in turn is communicatively connected with e.g. the control unit 10 of the vehicle 1. The communication is in such an embodiment preferably wireless, such as via WiFi, Bluetooth, 3G, 4G, 5G or the like.
[0068] The control unit 10 preferably performs the vehicle guiding method according to the third aspect, e.g. by issuing appropriate signals to steering actuators of the vehicle 1 such that any one of the travelling paths T or T′ is followed.
[0069] The above-mentioned methods may be implemented as one or more computer programs which are run by e.g. the control unit 10 and/or as a computer readable medium comprising instructions for causing e.g. the control unit 10 to perform any one of the methods.
[0070] The vehicles 1 in
[0071] The control units as disclosed herein may include a microprocessor, a microcontroller, a programmable digital signal processor or another programmable device. Thus, the control unit 10 may comprise electronic circuits and connections (not shown) as well as processing circuitry (not shown) such that the control unit 10 can communicate with different parts of the vehicle 1 or with different control units of the vehicle 1, such as with various sensors, systems and control units, in particular with one or more electronic control units (ECUs) controlling electrical systems or subsystems in the vehicle 1. The control unit 10 may comprise modules in either hardware or software, or partially in hardware or software, and communicate using known transmission buses such as a CAN-bus and/or wireless communication capabilities. The processing circuitry may be a general-purpose processor or a specific processor. The control unit 10 may comprise a non-transitory memory for storing computer program code and data. Thus, the skilled person realizes that the control unit 10 may be embodied by many different constructions.
[0072] It is to be understood that the present invention is not limited to the embodiments described above and illustrated in the drawings; rather, the skilled person will recognize that many changes and modifications may be made within the scope of the appended claims.