MITIGATING COLLISION RISK WITH AN OBSCURED OBJECT

20210370927 · 2021-12-02

    Abstract

    A method for reducing a risk of collision with an obscured object. The method includes detecting, by a detector, the object in proximity of the detector, and determining, by the detector, object data of the detected object in response to the detection. The method further includes determining, by the detector, any vehicle in motion in a direction towards an area where the detected object is or may be present, and transmitting, by the detector, the determined object data of the detected object to the vehicle. The method furthermore includes receiving, by the vehicle, the transmitted object data, processing, by the vehicle, the received object data to assess a collision risk based on a predicted route of the vehicle, and mitigating the collision risk, in the vehicle, based on the processed object data when the collision risk has been assessed.

    Claims

    1. A method for reducing a risk of collision with an obscured object, comprising the steps of: detecting, by a detector, the object in proximity of the detector, determining, by the detector, object data of the detected object in response to the detection, wherein the object data comprises non-image information, determining, by the detector, any vehicle in motion in a direction towards an area where the detected object is or may be present, transmitting, by the detector, the determined object data of the detected object to the vehicle, receiving, by the vehicle, the transmitted object data, processing, by the vehicle, the received object data to assess a collision risk based on a predicted route of the vehicle, and mitigating the collision risk, in the vehicle, based on the processed object data when the collision risk has been assessed.

    2. The method according to claim 1, further comprising the step of: discarding, by the vehicle, the received object data when the collision risk has been ruled out.

    3. The method according to claim 1, wherein the mitigating of the collision risk further comprises the step of: warning, in the vehicle, about the detected object and/or displaying, in the vehicle, a representation of the detected object based on the processed object data when the collision risk has been assessed.

    4. The method according to claim 1, wherein the mitigating of the collision risk further comprises the step of: automatically re-routing or braking or adjusting a speed of the vehicle.

    5. The method according to claim 1, wherein the object data comprises any one or all of position data, velocity vector data, and object type data.

    6. The method according to claim 1, wherein the detector comprises a sensor arranged on any one of a moving vehicle, a non-moving vehicle, a building, a wall, a lamp-post, and a road-side unit.

    7. The method according to claim 1, wherein the detecting of the object comprises detecting by any one of camera detection, radar detection, LIDAR detection, GPS detection, and wearable device detection.

    8. The method according to claim 1, wherein the processing of the received object data comprises the step of: determining a position of the detected object in relation to the vehicle based on the received object data.

    9. The method according to claim 1, wherein the processing of the received object data further comprises the step of: determining a speed and direction of the detected object in relation to the vehicle based on the received object data.

    10. A non-transitory computer readable medium, having stored thereon a computer program comprising program instructions, the computer program being loadable into a data processing unit and configured to cause execution of the method steps according to claim 1 when the computer program is run by the data processing unit.

    11. An arrangement for reducing a risk of collision with an obscured object, comprising: a memory comprising executable instructions, one or more processors configured to communicate with the memory wherein the one or more processors are configured to cause the arrangement to: detect the object in proximity of a detector, determine object data of the detected object in response to the detection, wherein the object data comprises non-image information, determine any vehicle in motion in a direction towards an area where the detected object is or may be present, and transmit the determined object data of the detected object to the vehicle.

    12. The arrangement according to claim 11, wherein the arrangement comprises a sensor arranged on any one of a moving vehicle, a non-moving vehicle, a building, a wall, a lamp-post, and a road-side unit.

    13. An arrangement for reducing a risk of collision with an obscured object, comprising: a memory comprising executable instructions, one or more processors configured to communicate with the memory wherein the one or more processors are configured to cause the arrangement to: receive object data of the object, wherein the object data comprises non-image information, process the received object data to assess the collision risk based on a predicted route, and mitigate the collision risk based on the processed object data when the collision risk has been assessed.

    14. The arrangement according to claim 13, wherein the one or more processors are configured to further cause the arrangement to: discard the received object data when the collision risk has been ruled out.

    15. The arrangement according to claim 13, wherein the mitigation of the collision risk comprises warning about the detected object and/or displaying a representation of the detected object based on the processed object data when the collision risk has been assessed.

    16. The arrangement according to claim 13, wherein the mitigation of the collision risk comprises automatically re-routing or braking or adjusting a speed of the vehicle.

    17. A vehicle comprising the arrangement according to claim 13.

    18. A system for reducing a risk of collision with an obscured object, comprising: a detecting module configured to detect an object in proximity of a detector, a determining module configured to determine object data of the detected object in response to the detection, wherein the object data comprises non-image information, a determining module configured to determine any vehicle in motion in a direction towards an area where the detected object is or may be present, a transmitting module configured to transmit the determined object data of the detected object to the vehicle, a receiving module configured to receive the transmitted object data, a processing module configured to process the received object data to assess the collision risk based on a predicted route of the vehicle, and a risk mitigation module configured to mitigate the collision risk based on the processed object data when the collision risk has been assessed.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0068] Further objects, features and advantages will appear from the following detailed description of embodiments, with reference being made to the accompanying drawings. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the example embodiments.

    [0069] FIG. 1 is a flowchart illustrating example method steps according to some embodiments;

    [0070] FIG. 2 is a schematic drawing illustrating an example environment according to some embodiments;

    [0071] FIG. 3 is a schematic drawing illustrating an example environment according to some embodiments;

    [0072] FIG. 4 is a schematic block diagram illustrating an example arrangement according to some embodiments; and

    [0073] FIG. 5 is a schematic drawing illustrating an example computer readable medium according to some embodiments.

    DETAILED DESCRIPTION

    [0074] As already mentioned above, it should be emphasized that the term “comprises/comprising” when used in this specification is taken to specify the presence of stated features, integers, steps, or components, but does not preclude the presence or addition of one or more other features, integers, steps, components, or groups thereof. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.

    [0075] Embodiments of the present disclosure will be described and exemplified more fully hereinafter with reference to the accompanying drawings. The solutions disclosed herein can, however, be realized in many different forms and should not be construed as being limited to the embodiments set forth herein.

    [0076] In the following, embodiments will be described that present alternative approaches to reducing a risk of collision with an obscured object.

    [0077] FIG. 1 is a flowchart illustrating example method steps according to some embodiments. The collision risk reducing method 100 is for reducing a risk of collision with an obscured object. Thus, the collision risk reducing method 100 may, for example, be performed by the collision risk reducing arrangement of FIG. 4 for reducing a risk of collision with an obscured object as illustrated in example environments in FIGS. 2 and 3.

    [0078] The collision risk reducing method 100 comprises the following steps.

    [0079] In step 101, an object in proximity of a detector is detected by the detector.

    [0080] As an example, the object may be a pedestrian, e.g. an adult or a child, a cyclist, or an animal.

    [0081] As an example, the detector may comprise a sensor arranged on any one of a moving vehicle, a non-moving vehicle, a building, a wall, a lamp-post, and a road-side unit.

    [0082] As an example, the sensor may be configured to detect the object by any one of camera detection, radar detection, LIDAR detection, GPS detection, and wearable device detection, e.g. via 3G/4G/5G or a GPS location provided by a wearable device such as a smartphone.

    [0083] As an example, being in proximity may comprise being within a distance of 10 meters of the detector.

    [0084] In step 102, object data of the detected object is determined by the detector in response to the detection.

    [0085] As an example, object data may comprise any one or all of position data, velocity vector data, and object type data. This kind of object data may be regarded as non-image information, as only parameters relating to the detected object are determined and not the whole image; e.g. the surroundings of the detected object are discarded, as they are not relevant from a detection perspective.

    [0086] Position data may comprise coordinates of where the detected object is at the time of the detection.

    [0087] Velocity vector data may comprise direction and speed of the detected object at the time of the detection.

    [0088] Object type data may comprise data defining the detected object at the time of the detection e.g. if the detected object is a pedestrian and/or if the detected pedestrian is an adult or a child, or if the detected object is a cyclist, or if the detected object is an animal etc.

    [0089] Based on the position data, velocity vector data, and object type data, the position, type, and movement of the detected object may be determined.

    [0090] Hence, based on the object data of the detected object, the position, speed, and direction of the detected object may be determined in relation to the vehicle.
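The object data described above can be pictured as a small record plus a helper that derives the relative position, speed, and direction. This is an illustrative sketch only; the field names, units, and helper function are assumptions, not taken from the patent.

```python
from dataclasses import dataclass
import math

@dataclass
class ObjectData:
    """Non-image object data determined at the time of detection (assumed fields)."""
    x: float          # position coordinate (m)
    y: float          # position coordinate (m)
    vx: float         # velocity vector component (m/s)
    vy: float         # velocity vector component (m/s)
    object_type: str  # e.g. "pedestrian_child", "cyclist", "animal"

def relative_state(obj: ObjectData, veh_x: float, veh_y: float,
                   veh_vx: float, veh_vy: float):
    """Position, speed and direction of the object in relation to the vehicle."""
    dx, dy = obj.x - veh_x, obj.y - veh_y
    rvx, rvy = obj.vx - veh_vx, obj.vy - veh_vy
    distance = math.hypot(dx, dy)          # separation in meters
    rel_speed = math.hypot(rvx, rvy)       # closing/opening speed magnitude
    bearing = math.degrees(math.atan2(dy, dx))  # direction from vehicle to object
    return distance, rel_speed, bearing
```

A vehicle-side consumer would feed its own pose and velocity into `relative_state` to localize the object it cannot see directly.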

    [0091] In step 103, any vehicle in motion in a direction towards an area where the detected object is or may be present is determined by the detector.

    [0092] As an example, the detector may be able to detect a vehicle in motion towards an area where the detected object is or may be present from a distance of e.g. 50-100 meters.
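Step 103 can be sketched as a filter over nearby vehicles: keep only those that are in motion, within detection range, and whose heading points towards the area of the detected object. The range and angle thresholds below are illustrative assumptions; the patent only gives an example distance of 50-100 meters.

```python
import math

def vehicles_heading_towards(area_x, area_y, vehicles,
                             max_range=100.0, max_angle_deg=30.0):
    """vehicles: iterable of (id, x, y, vx, vy). Returns ids of vehicles
    in motion towards the area where the detected object is or may be present."""
    approaching = []
    for vid, x, y, vx, vy in vehicles:
        dx, dy = area_x - x, area_y - y
        dist = math.hypot(dx, dy)
        speed = math.hypot(vx, vy)
        if dist > max_range or speed == 0.0:
            continue  # out of range, or not a vehicle in motion
        # angle between the vehicle's heading and the direction to the area
        cos_angle = (vx * dx + vy * dy) / (speed * dist)
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
        if angle <= max_angle_deg:
            approaching.append(vid)
    return approaching
```

Only vehicles returned by this filter would then receive the object data in step 104, which keeps the broadcast targeted.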

    [0093] In step 104, the determined object data of the detected object is transmitted by the detector to the vehicle.

    [0094] As only the object data, which may be regarded as non-image information, is transmitted rather than a whole image, the method becomes both more efficient, since the communication channels are not overloaded with redundant information, and hardware agnostic, since the object data is compatible with any type of hardware.

    [0095] As an example, the transmission of object data may be enabled via Vehicle-to-Vehicle (V2V) communication wherein the V2V communication enables an exchange of signals, i.e. information, between V2V communication enabled vehicles.

    [0096] As another example, the transmission of the object data may be enabled via Vehicle-to-Infrastructure (V2I) communication wherein the V2I communication enables an exchange of signals, i.e. information, between a V2I communication enabled vehicle and infrastructure.

    [0097] As a further example, the transmission of the object data may be enabled via Vehicle-to-Everything (V2X) communication wherein the V2X communication enables an exchange of signals, i.e. information, between a V2X communication enabled vehicle and any other device.

    [0098] Moreover, the transmission of the object data may be enabled via 3G/4G/5G enabled communication technology or any other communication technology.
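The patent does not specify a wire format for the transmitted object data; the sketch below shows one plausible compact encoding for broadcast over V2V/V2I/V2X or cellular links. The type codes, field layout, and byte order are all assumptions, chosen only to illustrate how small a non-image message can be compared to a camera image.

```python
import struct

# Assumed mapping from object type strings to single-byte codes.
TYPE_CODES = {"pedestrian_adult": 0, "pedestrian_child": 1,
              "cyclist": 2, "animal": 3}

def encode_object_data(x, y, vx, vy, object_type):
    """Pack position, velocity vector, and object type into 17 bytes:
    four little-endian 32-bit floats plus one unsigned byte."""
    return struct.pack("<ffffB", x, y, vx, vy, TYPE_CODES[object_type])

def decode_object_data(payload):
    """Unpack a 17-byte message back into its fields (vehicle side)."""
    x, y, vx, vy, code = struct.unpack("<ffffB", payload)
    inverse = {v: k for k, v in TYPE_CODES.items()}
    return x, y, vx, vy, inverse[code]
```

Seventeen bytes per detected object is orders of magnitude smaller than any image frame, which is the efficiency argument made above.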

    [0099] In step 105, the transmitted object data is received by the vehicle.

    [0100] As an example, the reception of object data may be enabled via V2V or V2I or V2X communication, or via 3G/4G/5G enabled communication technology or any other communication technology, in accordance with the transmission examples above.

    [0101] In step 106, the received object data is processed by the vehicle to assess a collision risk based on a predicted route of the vehicle.

    [0102] As an example, the processing of the object data may comprise data analysis in order to determine whether the properties of the detected object may result in the detected object overlapping with the predicted route of the vehicle.
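One simple way to perform the overlap check described above is to extrapolate the object's motion with a constant-velocity model and test whether it comes within a safety radius of any timed waypoint on the vehicle's predicted route. The extrapolation model, route representation, and safety radius are illustrative assumptions, not details from the patent.

```python
import math

def collision_risk(obj_x, obj_y, obj_vx, obj_vy,
                   route, safety_radius=2.0):
    """route: list of (t, x, y) predicted vehicle waypoints, t in seconds.
    Returns True when the extrapolated object position overlaps the route."""
    for t, rx, ry in route:
        # constant-velocity extrapolation of the object to time t
        ox = obj_x + obj_vx * t
        oy = obj_y + obj_vy * t
        if math.hypot(ox - rx, oy - ry) <= safety_radius:
            return True
    return False
```

A negative result here would trigger the optional discard in step 107; a positive result would trigger the mitigation in step 108.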

    [0103] In optional step 107, the received object data is discarded by the vehicle when the collision risk has been ruled out.

    [0104] In step 108, the collision risk is mitigated in the vehicle, based on the processed object data, when the collision risk has been assessed.

    [0105] As an example, the vehicle may determine that the collision risk must be mitigated through an action of the vehicle.

    [0106] In optional step 109, the mitigation comprises warning about the detected object in the vehicle.

    [0107] As an example, the warning may comprise a warning sound whose volume increases the closer the vehicle gets to the detected object.

    [0108] As an example, the warning of the collision risk may be executed by a security system in the vehicle.
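The distance-dependent warning can be sketched as a simple mapping from separation distance to a volume level. The linear ramp and the two thresholds below are assumptions; the patent only states that the sound increases as the vehicle approaches.

```python
def warning_volume(distance_m, full_volume_at=5.0, silent_beyond=50.0):
    """Return a volume level in [0.0, 1.0] that grows as the distance
    to the detected object shrinks (assumed linear ramp)."""
    if distance_m >= silent_beyond:
        return 0.0   # object still far away: no warning yet
    if distance_m <= full_volume_at:
        return 1.0   # object very close: full-volume warning
    # linear ramp between the silent and full-volume thresholds
    return (silent_beyond - distance_m) / (silent_beyond - full_volume_at)
```

The vehicle's security system would re-evaluate this level as fresh object data arrives, so the sound swells continuously during the approach.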

    [0109] In optional step 110, a representation of the detected object is displayed in the vehicle, based on the processed object data, when the collision risk has been assessed.

    [0110] As an example, the representation may comprise a highlighted shape, e.g. a rectangle, displayed on a heads-up display (HUD) in the vehicle at the position where the detected object is expected to be present based on the object data. Alternatively, the representation may comprise a pre-determined figure, chosen from a plurality of pre-determined figures, displayed on the HUD based on the object data.

    [0111] In optional step 111, the mitigation comprises automatically re-routing or braking or adjusting a speed of the vehicle.

    [0112] As an example, the automatic actions in response to the collision risk may be executed by a security system in the vehicle.
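The patent names three automatic actions for step 111 (re-routing, braking, speed adjustment) but does not say how to choose between them. The time-to-collision policy below is therefore an assumption, included only to show how a security system might dispatch one of the named actions.

```python
def choose_mitigation(distance_m, closing_speed_mps):
    """Pick one of the automatic actions named in step 111, based on an
    assumed time-to-collision (TTC) policy."""
    if closing_speed_mps <= 0:
        return "none"                         # not closing in on the object
    ttc = distance_m / closing_speed_mps      # seconds until predicted overlap
    if ttc < 2.0:
        return "brake"                        # imminent: emergency braking
    if ttc < 5.0:
        return "adjust_speed"                 # near: slow down smoothly
    return "re-route"                         # far: plan around the area
```

The specific thresholds (2 s and 5 s) are placeholders; a production system would calibrate them against vehicle dynamics and road conditions.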

    [0113] FIG. 2 is a schematic drawing illustrating an example environment according to some embodiments. The example environment 200 illustrates reducing a risk of collision with an obscured object. Thus, the collision risk reducing method 100 may, for example, be performed by the collision risk reducing arrangement 410a,410b of FIG. 4 for reducing a risk of collision with an obscured object in an example environment 200 as in FIG. 2.

    [0114] FIG. 2 illustrates an environment 200 comprising a first vehicle 201 wherein the first vehicle 201 comprises an arrangement for reducing a risk of collision with an obscured object comprising a memory comprising executable instructions and one or more processors configured to communicate with the memory.

    [0115] The one or more processors are configured to cause the arrangement in the first vehicle 201 to detect the objects 203,204, e.g. obscured pedestrians about to cross or having already crossed the street, in proximity of a detector comprised in the first vehicle 201 and/or a lamp-post 207 and/or a detector 208 mounted on a wall 205,206, and determine object data of the detected objects 203,204 in response to the detection.

    [0116] The one or more processors are further configured to cause the arrangement to determine any vehicle in motion, e.g. a second vehicle 202, in a direction towards an area where the detected object 203,204 is or may be present, and transmit the determined object data of the detected objects 203,204 to the second vehicle 202.

    [0117] In some embodiments, the detector comprises a sensor 208 arranged on any one of a moving vehicle 202, a non-moving vehicle (not shown), a building (not shown), a wall 205,206, a lamp-post 207, and a road-side unit (not shown).

    [0118] FIG. 3 is a schematic drawing illustrating an example environment according to some embodiments. The example environment 300 illustrates reducing a risk of collision with an obscured object. Thus, the collision risk reducing method 100 may, for example, be performed by the collision risk reducing arrangement 410a,410b of FIG. 4 for reducing a risk of collision with an obscured object in an example environment 300 as in FIG. 3.

    [0119] FIG. 3 illustrates an environment 300 comprising a second vehicle 302 wherein the second vehicle 302 comprises an arrangement for reducing a risk of collision with an obscured object comprising a memory comprising executable instructions and one or more processors configured to communicate with the memory.

    [0120] The one or more processors are configured to cause the arrangement in the second vehicle 302 to receive object data of the object 303, e.g. an obscured pedestrian about to cross the street, and process the received object data to assess the collision risk based on a predicted route.

    [0121] The one or more processors are further configured to cause the arrangement to mitigate the collision risk based on the processed object data when the collision risk has been assessed.

    [0122] In some embodiments, the one or more processors are further configured to cause the arrangement to discard the received object data when the collision risk has been ruled out.

    [0123] In some embodiments, the detector comprises a sensor 308 arranged on any one of a moving vehicle 301, a non-moving vehicle (not shown), a building (not shown), a wall 305, a lamp-post 307, and a road-side unit (not shown).

    [0124] The sensor 308 may be mounted on a wall 305 and positioned such that the angle of the sensor 308 covers the most critical parts of the intersection so that any blind spots are minimized.

    [0125] In some embodiments, the mitigation of the collision risk comprises warning about the detected object and/or displaying a representation of the detected object in the second vehicle 302 based on the processed object data when the collision risk has been assessed.

    [0126] In some embodiments, the mitigation of the collision risk comprises automatically re-routing or braking or adjusting a speed of the second vehicle 302.

    [0127] FIG. 4 is a schematic block diagram illustrating an example arrangement according to some embodiments. The example arrangement is a collision risk reducing arrangement 410a,410b for reducing a risk of collision with an obscured object. Thus, the collision risk reducing method 100 may, for example, be performed by the collision risk reducing arrangement 410a,410b of FIG. 4 for reducing a risk of collision with an obscured object as illustrated in example environments 200,300 in FIGS. 2 and 3.

    [0128] The collision risk reducing arrangement 410a comprises controlling circuitry CNTR 400a, which may in turn comprise a detecting module DETC 401, e.g. detecting circuitry, configured to detect an object in proximity of a detector, a determining module DETR 402, e.g. determining circuitry, configured to determine object data of the detected object in response to the detection, a determining module DETR 403, e.g. determining circuitry, configured to determine any vehicle in motion in a direction towards an area where the detected object is or may be present, and a transmitting module TX 404, e.g. transmitting circuitry, configured to transmit the determined object data of the detected object to the vehicle.

    [0129] The collision risk reducing arrangement 410a may be comprised in a detector.

    [0130] The collision risk reducing arrangement 410b comprises controlling circuitry CNTR 400b, which may in turn comprise a receiving module RX 405, e.g. receiving circuitry, configured to receive the transmitted object data, a processing module PROC 406, e.g. processing circuitry, configured to process the received object data to assess the collision risk based on a predicted route of the vehicle, and a risk mitigation module MIT 408, e.g. risk mitigating circuitry, configured to mitigate the collision risk based on the processed object data when the collision risk has been assessed.

    [0131] The collision risk reducing arrangement 410b may be comprised in a vehicle.

    [0132] The collision risk reducing arrangements 410a and 410b may be wirelessly connected and communicate over, e.g., V2V, V2I, or V2X communication, or 3G/4G/5G enabled communication technology, or any other communication technology.

    [0133] In some embodiments, the controlling circuitry CNTR 400b may further comprise a discarding arrangement DISC 407, e.g. discarding circuitry, configured to discard the received object data when the collision risk has been ruled out.

    [0134] In some embodiments, the controlling circuitry CNTR 400b may further comprise a warning arrangement WARN 409, e.g. warning circuitry, configured to warn about the detected object.

    [0135] In some embodiments, the controlling circuitry CNTR 400b may further comprise a display arrangement DISPL 410, e.g. displaying circuitry, configured to display a representation of the detected object.

    [0136] The collision risk reducing arrangements 410a,410b may be configured to perform method steps of any of the methods described in connection with FIG. 1.

    [0137] FIG. 5 is a schematic drawing illustrating an example computer readable medium according to some embodiments. The computer program product comprises a non-transitory computer readable medium 500 having thereon a computer program 510 comprising program instructions, the computer program being loadable into a data processing unit and configured to cause execution of the method steps of any of the methods described in connection with FIG. 1.

    [0138] Generally, when an arrangement is referred to herein, it is to be understood as a physical product; e.g., an apparatus. The physical product may comprise one or more parts, such as controlling circuitry in the form of one or more controllers, one or more processors, or the like.

    [0139] The described embodiments and their equivalents may be realized in software or hardware or a combination thereof. The embodiments may be performed by general purpose circuitry. Examples of general purpose circuitry include digital signal processors (DSP), central processing units (CPU), co-processor units, field programmable gate arrays (FPGA) and other programmable hardware. Alternatively or additionally, the embodiments may be performed by specialized circuitry, such as application specific integrated circuits (ASIC). The general purpose circuitry and/or the specialized circuitry may, for example, be associated with or comprised in an apparatus such as a vehicle.

    [0140] Embodiments may appear within an electronic apparatus (associated with or comprised in a vehicle) comprising arrangements, circuitry, and/or logic according to any of the embodiments described herein. Alternatively or additionally, an electronic apparatus (associated with or comprised in a vehicle) may be configured to perform methods according to any of the embodiments described herein.

    [0141] According to some embodiments, a computer program product comprises a computer readable medium such as, for example, a universal serial bus (USB) memory, a plug-in card, an embedded drive, or a read only memory (ROM). FIG. 5 illustrates an example computer readable medium in the form of a compact disc (CD) ROM 500. The computer readable medium has stored thereon a computer program comprising program instructions. The computer program is loadable into a data processor (PROC) 520, which may, for example, be comprised in an apparatus or vehicle 510. When loaded into the data processing unit, the computer program may be stored in a memory (MEM) 530 associated with or comprised in the data processing unit. According to some embodiments, the computer program may, when loaded into and run by the data processing unit, cause execution of method steps according to, for example, any of the methods illustrated in FIG. 1 or otherwise described herein.

    [0142] Generally, all terms used herein are to be interpreted according to their ordinary meaning in the relevant technical field, unless a different meaning is clearly given and/or is implied from the context in which it is used.

    [0143] Reference has been made herein to various embodiments. However, a person skilled in the art would recognize numerous variations to the described embodiments that would still fall within the scope of the claims.

    [0144] For example, the method embodiments described herein disclose example methods through steps being performed in a certain order. However, it is recognized that these sequences of events may take place in another order without departing from the scope of the claims. Furthermore, some method steps may be performed in parallel even though they have been described as being performed in sequence. Thus, the steps of any methods disclosed herein do not have to be performed in the exact order disclosed, unless a step is explicitly described as following or preceding another step and/or where it is implicit that a step must follow or precede another step.

    [0145] In the same manner, it should be noted that in the description of embodiments, the partition of functional blocks into particular units is by no means intended as limiting. On the contrary, these partitions are merely examples. Functional blocks described herein as one unit may be split into two or more units. Furthermore, functional blocks described herein as being implemented as two or more units may be merged into fewer (e.g. a single) unit.

    [0146] Any feature of any of the embodiments disclosed herein may be applied to any other embodiment, wherever suitable. Likewise, any advantage of any of the embodiments may apply to any other embodiments, and vice versa.

    [0147] Hence, it should be understood that the details of the described embodiments are merely examples brought forward for illustrative purposes, and that all variations that fall within the scope of the claims are intended to be embraced therein.