VEHICLE REMINDING METHOD AND SYSTEM, AND RELATED DEVICE
20230222914 · 2023-07-13
Inventors
CPC classification
B60Q5/008
PERFORMING OPERATIONS; TRANSPORTING
B60Q1/525
PERFORMING OPERATIONS; TRANSPORTING
International classification
B60Q9/00
PERFORMING OPERATIONS; TRANSPORTING
Abstract
A vehicle reminding method and system, and a related device are provided. The method includes: obtaining detection information, where the detection information includes related information that is of a target vehicle and that is collected by a sensor of an ego vehicle; performing collision analysis based on the detection information; and reminding the target vehicle of a collision risk.
Claims
1. A vehicle reminding method, wherein the method comprises: obtaining detection information, wherein the detection information comprises related information that is of a target vehicle and that is collected by a sensor; performing collision analysis based on the detection information; and reminding the target vehicle of a collision risk determined based on the collision analysis.
2. The method according to claim 1, wherein the detection information further comprises related information that is of an ego vehicle and that is collected by the sensor.
3. The method according to claim 1, wherein the related information of the target vehicle comprises an acceleration value of the target vehicle, and before the performing the collision analysis based on the detection information, the method further comprises: determining whether the acceleration value of the target vehicle is less than a first threshold; and performing the collision analysis based on the related information of the target vehicle when the acceleration value of the target vehicle is less than the first threshold.
4. The method according to claim 1, wherein the performing the collision analysis based on the detection information comprises: performing the collision analysis based on the detection information to determine a collision risk level, wherein the collision risk level is one of a low risk, a medium risk, or a high risk.
5. The method according to claim 4, wherein the reminding the target vehicle of the collision risk comprises: reminding the target vehicle by controlling a sound and/or light when the collision risk level is determined.
6. The method according to claim 4, wherein the reminding the target vehicle of the collision risk comprises: sending a vehicle-to-everything (V2X) message to the target vehicle to remind the target vehicle of the collision risk level.
7. A vehicle reminding apparatus, comprising: one or more processors, and a memory coupled to the one or more processors and storing program instructions, which, when executed by the one or more processors, cause the apparatus to: receive detection information, wherein the detection information comprises related information that is of a target vehicle and that is collected by a sensor; process the detection information; perform collision analysis based on the detection information; and remind the target vehicle of a potential risk based on the collision analysis.
8. The apparatus according to claim 7, wherein the program instructions further cause the apparatus to receive related information of an ego vehicle.
9. The apparatus according to claim 7, wherein the program instructions further cause the apparatus to: determine whether an acceleration value of the target vehicle is less than a first threshold before performing collision analysis; and perform the collision analysis based on the related information of the target vehicle when the acceleration value of the target vehicle is less than the first threshold.
10. The apparatus according to claim 7, wherein the program instructions further cause the apparatus to: determine a collision risk level after performing collision analysis, wherein the collision risk level is one of a low risk, a medium risk, or a high risk.
11. The apparatus according to claim 10, wherein the program instructions further cause the apparatus to: remind the target vehicle by controlling a sound and/or light when the collision risk level is determined.
12. The apparatus according to claim 10, wherein the program instructions further cause the apparatus to: send a vehicle to everything (V2X) message to the target vehicle to remind the target vehicle when the collision risk level is determined.
13. A non-transitory computer-readable medium storing program instructions, which, when executed by one or more processors of a vehicle reminding apparatus, cause the apparatus to: receive detection information, wherein the detection information comprises related information that is of a target vehicle and that is collected by a sensor; process the detection information; perform collision analysis based on the detection information; and remind the target vehicle of a potential risk based on the collision analysis.
14. The non-transitory computer-readable medium according to claim 13, wherein the program instructions further cause the apparatus to receive related information of an ego vehicle.
15. The non-transitory computer-readable medium according to claim 13, wherein the program instructions further cause the apparatus to: determine whether an acceleration value of the target vehicle is less than a first threshold before performing collision analysis; and perform the collision analysis based on the related information of the target vehicle when the acceleration value of the target vehicle is less than the first threshold.
16. The non-transitory computer-readable medium according to claim 13, wherein the program instructions further cause the apparatus to: determine a collision risk level after performing collision analysis, wherein the collision risk level is one of a low risk, a medium risk, or a high risk.
17. The non-transitory computer-readable medium according to claim 16, wherein the program instructions further cause the apparatus to: remind the target vehicle by controlling a sound and/or light when the collision risk level is determined.
18. The non-transitory computer-readable medium according to claim 16, wherein the program instructions further cause the apparatus to: send a vehicle to everything (V2X) message to the target vehicle to remind the target vehicle when the collision risk level is determined.
Description
BRIEF DESCRIPTION OF DRAWINGS
DESCRIPTION OF EMBODIMENTS
[0046] The following clearly describes technical solutions in embodiments of this application with reference to accompanying drawings. It is clear that the described embodiments are merely some but not all embodiments of this application. All other embodiments obtained by a person of ordinary skill in the art based on embodiments of this application without creative efforts shall fall within the protection scope of this application.
[0047] Some terms and related technologies in this application are first described, to facilitate understanding by a person skilled in the art.
[0048] A vehicle-to-everything (V2X) system is a communication system dedicated to connecting a vehicle to its ambient environment. Two mainstream technologies are currently used for V2X communication worldwide. One is dedicated short range communications (DSRC), and the other is a cellular vehicle-to-everything (C-V2X) technology based on a cellular mobile communication system, including LTE-V2X and 5G NR-V2X. V2X is classified into four types based on different connection objects: a vehicle-to-vehicle (V2V) type, a vehicle-to-infrastructure (V2I) type, a vehicle-to-pedestrian (V2P) type, and a vehicle-to-network (V2N) type. V2V indicates that vehicles may communicate with each other directly. A vehicle is used as a mobile communication terminal, and has a capability of receiving and sending basic vehicle body data.
[0049] An adaptive cruise control (ACC) system is developed based on a cruise control system. In addition to a function of the cruise control system, that is, a function of driving at a speed set by a driver, the adaptive cruise control system can also implement a function of maintaining a preset vehicle following distance and automatically accelerating and decelerating as the distance changes. Compared with the cruise control system, the ACC system can help a driver better coordinate a brake and a throttle.
[0050] A full-speed adaptive cruise control system is developed from the adaptive cruise control system. Compared with the adaptive cruise control system, the full-speed system has a larger operating range and can work from 0 km/h to 150 km/h.
[0051] A road side unit (RSU) is an apparatus installed on roadsides in an electronic toll collection (ETC) system, and communicates with an on-board unit (OBU) by using the dedicated short range communications (DSRC) technology, to implement vehicle identification and electronic fee deduction. The RSU may include a high-gain directional beam-control read/write antenna and a radio frequency controller. The high-gain directional beam-control read/write antenna is a microwave transceiver module that transmits/receives, modulates/demodulates, encodes/decodes, and encrypts/decrypts data signals. The radio frequency controller is a module that controls data transmission and reception and exchanges information with an upper computer.
[0052] To facilitate understanding of embodiments of this application, the following first describes a vehicle reminding system architecture on which embodiments of this application are based.
[0053] It should be noted that, in the vehicle reminding system architecture shown in
[0054] Based on the vehicle reminding system architectures shown in
[0055] As shown in
[0056] The sensor module 210 is configured to obtain detection information, where the detection information includes related information of a target vehicle and related information of an ego vehicle that are collected by a sensor.
[0057] The fusion control module 220 is configured to: process the detection information, perform collision analysis, and make a control decision.
[0058] The reminding module 230 is configured to remind the target vehicle based on the control decision.
[0059] It should be noted that, as shown in
[0060] It may be understood that the structure of the vehicle reminding device in
[0061] Based on the foregoing vehicle driving control system architecture, an embodiment of this application provides an intelligent vehicle 300 applied to the foregoing vehicle reminding system architecture.
[0062] It should be noted that the intelligent vehicle 300 may be set to a fully intelligent driving mode, or may be set to a partially intelligent driving mode. It may be understood that, when the intelligent vehicle 300 is set to the fully intelligent driving mode, the intelligent vehicle 300 may perform a corresponding operation without interacting with a person. The operation includes but is not limited to acceleration, deceleration, and vehicle-following. When the intelligent vehicle 300 is set to the partially intelligent driving mode, the intelligent vehicle 300 may automatically perform a corresponding operation, and a driver may further perform a corresponding operation on the intelligent vehicle, for example, determining the vehicle and an ambient environment of the vehicle, determining a possible behavior of at least one other vehicle in the ambient environment, and determining a confidence level corresponding to a possibility that the other vehicle performs the possible behavior. The driver then controls the intelligent vehicle 300 based on the determined information.
[0063] The intelligent vehicle 300 may include various subsystems, for example, a travel system 310, a sensor system 320, a control system 330, one or more peripheral devices 340, a computer system 350, a power supply 360, and a user interface 370. Optionally, the intelligent vehicle 300 may include more or fewer subsystems, and each subsystem may include a plurality of elements. In addition, each of the subsystems and elements of the intelligent vehicle 300 may be interconnected in a plurality of ways, for example, may be interconnected in a wired or wireless manner.
[0064] The travel system 310 may include a component providing power to the intelligent vehicle 300. In an embodiment, the travel system 310 may include an engine 3110, an energy source 3120, a transmission apparatus 3130, and wheels/tires 3140. The engine 3110 may be an internal combustion engine, an electric motor, an air compression engine, or a combination of other types of engines, for example, a hybrid engine including a gasoline engine and an electric motor, or a hybrid engine including an internal combustion engine and an air compression engine. The engine 3110 converts the energy source 3120 into mechanical energy.
[0065] Examples of the energy source 3120 include gasoline, diesel, another oil-based fuel, propane, another compressed gas-based fuel, ethanol, a solar panel, a battery, and another power source. The energy source 3120 may also provide energy to another system of the intelligent vehicle 300.
[0066] The transmission apparatus 3130 may transmit mechanical power from the engine 3110 to the wheels 3140. The transmission apparatus 3130 may include a gearbox, a differential, and a drive shaft. In an embodiment, the transmission apparatus 3130 may further include another component, for example, a clutch. The drive shaft may include one or more shafts that may be coupled to one or more wheels 3140.
[0067] The sensor system 320 may include several sensors that can sense information about an ambient environment of the intelligent vehicle 300 and obtain information of the ego vehicle. For example, the sensor system 320 may include a positioning system 3210, an inertial measurement unit (IMU) 3220, a radar 3230, and a vision sensor 3240. The positioning system 3210 may include a GPS system, a BeiDou system, or another positioning system. The sensor system 320 may further include sensors that monitor an internal system of the intelligent vehicle 300, for example, an in-vehicle air quality monitor, a fuel gauge, or an engine oil thermometer. Data obtained by these sensors can be used to detect an object and corresponding features of the object. The features include but are not limited to a location, a shape, a direction, and a speed. Such detection and recognition are significant for the intelligent vehicle 300 to safely perform subsequent operations.
[0068] The positioning system 3210 may be configured to determine a geographic location of the intelligent vehicle 300.
[0069] The IMU 3220 can sense a location and an orientation change of the intelligent vehicle 300 based on an inertial acceleration. In an embodiment, the IMU 3220 may be a combination of an accelerometer and a gyroscope. In this case, the IMU 3220 may be configured to measure a curvature of the intelligent vehicle 300.
[0070] The radar 3230 may sense an ambient environment of the intelligent vehicle 300 by using a wireless signal, and the ambient environment includes but is not limited to surrounding vehicles, infrastructure, and pedestrians. It may be understood that the radar 3230 may include but is not limited to a millimeter-wave radar and a lidar. In some embodiments, in addition to sensing the ambient environment, the radar 3230 may also be used to sense a motion status of an object in the environment.
[0071] The vision sensor 3240 may be configured to capture a plurality of images of the ambient environment of the intelligent vehicle 300. The vision sensor 3240 may include but is not limited to a static camera and a video camera.
[0072] The control system 330 may be configured to control operations of the intelligent vehicle 300 and components of the intelligent vehicle 300. The control system 330 may include a plurality of elements. In an embodiment, the control system 330 includes a steering system 3310, an actuator 3320, a brake unit 3330, a computer vision system 3340, a route control system 3350, and an obstacle avoidance system 3360.
[0073] The steering system 3310 may be operated to adjust a moving direction of the intelligent vehicle 300. For example, in an embodiment, the steering system 3310 may include a steering wheel system.
[0074] The actuator 3320 may be configured to control the engine 3110 and further control a speed of the intelligent vehicle 300. For example, in an embodiment, the actuator 3320 may include a throttle.
[0075] The brake unit 3330 may be configured to control the intelligent vehicle 300 to decelerate. The brake unit 3330 may use friction to reduce a rotational speed of the wheels 3140. In another embodiment, the brake unit 3330 may convert kinetic energy of the wheels 3140 into a current. Alternatively, the brake unit 3330 may adopt another method to reduce the rotational speed of the wheels 3140, to control the speed of the intelligent vehicle 300.
[0076] It may be understood that the actuator 3320 and the brake unit 3330 may be combined into one unit module, and the combined unit module may be configured to control the speed of the intelligent vehicle 300. In an embodiment, the combined unit module may include a throttle system and a brake system.
[0077] The computer vision system 3340 may be configured to process and analyze an image captured by the vision sensor 3240, to identify an ambient environment of the intelligent vehicle 300, and features and a motion status of an object in the ambient environment. The ambient environment may include a traffic signal, a road boundary, and an obstacle. Features of an object in the ambient environment include but are not limited to a surface optical feature of the object. The motion status includes but is not limited to a stationary state, acceleration, and deceleration. The computer vision system 3340 may use an object recognition algorithm, a structure from motion (SFM) algorithm, video tracking, and another computer vision technology. In some embodiments, the computer vision system 3340 includes an image detection system, a neural network-based processing system, and the like, and may be configured to: draw a map for an environment, track an object, estimate a speed of the object, and the like.
[0078] The route control system 3350 is configured to determine a driving route of the intelligent vehicle 300. In some embodiments, the route control system 3350 may combine data from one or more predetermined maps of the positioning system 3210 to determine the driving route of the intelligent vehicle 300.
[0079] The obstacle avoidance system 3360 is configured to: identify, evaluate, avoid, or bypass obstacles in the ambient environment. In an embodiment, the obstacle avoidance system 3360 needs to obtain information of the ambient environment by using the radar 3230 and the vision sensor 3240; the computer vision system 3340 is used to analyze the ambient environment and identify potential obstacles; and then the obstacle avoidance system 3360 performs evaluation and avoidance.
[0080] It should be noted that the control system 330 may add another component, or may replace and/or reduce the foregoing components.
[0081] The intelligent vehicle 300 interacts with an external sensor, another vehicle, another computer system, or a user by using the peripheral device 340. The peripheral device 340 may include but is not limited to a wireless communication system 3410, a vehicle-mounted computer 3420, a microphone 3430, and/or a speaker 3440.
[0082] It should be noted that, in some embodiments, the peripheral device 340 may interact with the user of the intelligent vehicle 300. For example, the vehicle-mounted computer 3420 may provide information for the user of the intelligent vehicle 300, and at the same time, the user of the intelligent vehicle 300 may also upload data to the vehicle-mounted computer 3420. It should be understood that the user of the intelligent vehicle 300 may perform operations by using a touchscreen of the vehicle-mounted computer 3420. In addition, the peripheral device 340 may provide a means for the intelligent vehicle 300 to communicate with another device in the vehicle. For example, the microphone 3430 may receive audio from the user of the intelligent vehicle 300. The audio may include voice commands and another audio input. Similarly, the speaker 3440 may output audio to the user of the intelligent vehicle 300.
[0083] The wireless communication system 3410 may wirelessly communicate with one or more devices directly or through a communication network. For example, the wireless communication system 3410 may use 3G cellular communication, such as CDMA, EVDO, or GSM/GPRS, 4G cellular communication, such as LTE, or 5G cellular communication. The wireless communication system 3410 may communicate with a wireless local area network (WLAN) by using Wi-Fi. In some embodiments, the wireless communication system 3410 may communicate directly with a device through an infrared link, Bluetooth, or ZigBee. The device may include but is not limited to another vehicle and/or a roadside public facility.
[0084] It should be noted that, in this embodiment of this application, the ego vehicle and the target vehicle can communicate with each other by using V2X. Therefore, the wireless communication system 3410 may further include one or more DSRC devices and one or more LTE-V2X devices.
[0085] The power supply 360 may supply power to various components of the intelligent vehicle 300. In an embodiment, the power supply 360 may include one or more battery packs, and a battery in the battery pack may be a rechargeable lithium-ion battery or a lead-acid battery. It is understood that in some embodiments, the power supply 360 and the energy source 3120 may be implemented together.
[0086] Some or all of functions of the intelligent vehicle 300 are controlled by the computer system 350. The computer system 350 may include one or more processors 3520, the processor 3520 executes instructions 35110, and the instructions 35110 are stored in a non-transient computer-readable medium such as a memory 3510. The computer system 350 may alternatively be a plurality of computing devices that control individual components or subsystems of the intelligent vehicle 300 in a distributed manner.
[0087] The processor 3520 may be any conventional processor, for example, a commercially available CPU. Optionally, the processor may be a dedicated device such as an application-specific integrated circuit (ASIC) or another hardware-based processor. Although
[0088] In various aspects described herein, the processor may be located far away from the vehicle and communicate wirelessly with the vehicle. In other aspects, some of the processes described herein are performed on a processor disposed inside the vehicle, while others are performed by a remote processor, including performing the operations necessary to execute a single maneuver.
[0089] In some embodiments, the memory 3510 may include the instructions 35110 (for example, program logic), and the instructions 35110 may be executed by the processor 3520 to implement various functions including the foregoing functions of the intelligent vehicle 300. The memory 3510 may further include additional instructions, including instructions for sending data to, receiving data from, interacting with, and/or controlling one or more of the travel system 310, the sensor system 320, the control system 330, and the peripheral device 340.
[0090] In addition to storing instructions 35110, the memory 3510 may further store data, such as a road map, route information, vehicle data such as a location, a direction, and a speed of a vehicle, and other related information. It may be understood that, in an embodiment, when the intelligent vehicle 300 is in an autonomous driving mode, a partially autonomous driving mode, and/or a manual driving mode, the computer system 350 of the intelligent vehicle 300 can perform a related operation by using the data. For example, the computer system 350 of the intelligent vehicle 300 may adjust a current speed of the intelligent vehicle based on road information of a target road section and a received speed range of the target vehicle, to enable the intelligent vehicle to follow the vehicle at a constant speed.
[0091] The user interface 370 is used to provide information for or receive information from the user of the intelligent vehicle 300. Optionally, the user interface 370 may include an interface required by one or more input/output devices in the peripheral device 340, for example, a USB interface, an AUX interface, or an OBD interface.
[0092] The computer system 350 may control functions of the intelligent vehicle 300 based on data of various subsystems (for example, the travel system 310, the sensor system 320, and the control system 330) and data received from the user interface 370. For example, the computer system 350 may control the steering system 3310 to avoid an obstacle detected by the sensor system 320 and the obstacle avoidance system 3360.
[0093] Optionally, the foregoing components may be assembled as subsystems inside the intelligent vehicle 300, and one or more of the foregoing components may be installed separately from the intelligent vehicle 300. For example, the memory 3510 may exist partially or entirely separated from the intelligent vehicle 300. The foregoing components may be coupled in a wired and/or wireless manner.
[0094] It should be noted that the foregoing modules and components in the modules may be added, replaced, or deleted based on an actual requirement. This is not limited in this application.
[0095] An intelligent driving vehicle traveling on a road, for example, the intelligent vehicle 300 shown in
[0096] Optionally, the intelligent vehicle 300 or the computing device associated with the intelligent vehicle 300 (such as the computer system 350, the computer vision system 3340, and the memory 3510 in
[0097] In addition to providing an instruction for adjusting the speed of the intelligent vehicle 300, the computing device may further provide an instruction for modifying a steering angle of the intelligent vehicle 300, so that the autonomous vehicle can follow a given lane and/or maintain safe horizontal and vertical distances from an object (for example, a car in a neighboring lane) near the vehicle.
[0098] The intelligent vehicle 300 may be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, a recreational vehicle, a playground vehicle, a construction device, a trolley, a golf cart, a train, a handcart, or the like. This is not limited in this embodiment of this application.
[0099] It may be understood that the schematic diagram of the structure of the intelligent vehicle shown in
[0100] Based on the schematic diagrams of vehicle reminding system architectures shown in
[0101]
[0102] S410: Obtain detection information.
[0103] Specifically, an ego vehicle obtains related information of a target vehicle and related information of the ego vehicle. After the ego vehicle is started, the sensor module 210 obtains the detection information of a surrounding vehicle and the detection information of the ego vehicle. The detection information of the surrounding vehicle is obtained by the external sensing module 2120, and the detection information of the surrounding vehicle includes but is not limited to a speed and an acceleration of the surrounding vehicle, a relative distance from the surrounding vehicle to the ego vehicle, and a specific orientation of the surrounding vehicle. The detection information of the ego vehicle is obtained by the internal sensing module 2110, and the detection information of the ego vehicle includes but is not limited to information such as a speed and an acceleration of the ego vehicle.
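The detection information described above can be sketched as a simple data structure. The field names and units here are illustrative assumptions, not terms from the application:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TargetDetection:
    """Per-target measurements from the external sensing module 2120
    (illustrative field names and SI units)."""
    speed_mps: float       # speed of the surrounding vehicle
    accel_mps2: float      # acceleration of the surrounding vehicle
    rel_distance_m: float  # relative distance to the ego vehicle
    bearing_deg: float     # specific orientation relative to the ego vehicle

@dataclass
class EgoState:
    """Ego-vehicle measurements from the internal sensing module 2110."""
    speed_mps: float
    accel_mps2: float

@dataclass
class DetectionInfo:
    """Detection information passed from the sensor module 210 to the
    fusion control module 220."""
    ego: EgoState
    targets: List[TargetDetection] = field(default_factory=list)
```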
[0104] It should be noted that obtaining the detection information by the sensor module 210 is the first operation for the ego vehicle to identify, detect, and classify the ambient environment. The sensor module 210 needs to obtain detection information from various target objects, including but not limited to stationary vehicles and moving vehicles. The sensor module 210 also needs to obtain detection information of a plurality of types, including but not limited to a speed and an acceleration of the target object, a relative distance between the target object and the ego vehicle, and a specific orientation of the target object. Therefore, performance of the sensor module 210 is critical to identification of the ambient environment of the ego vehicle.
[0105] It can be understood that different types of sensors have different features, advantages, and disadvantages. For example, a lidar uses laser beams for positioning, so it has high positioning accuracy, provides good depth information, and can measure both distance and speed, but it cannot detect objects at long distances. A millimeter-wave radar has good range and speed measurement precision but a poor classification effect. A wheel speed sensor can obtain a wheel speed, and further obtain speed information of the ego vehicle. The sensor module 210 may therefore combine different sensors to achieve better performance.
[0106] In this embodiment of this application, the sensor module 210 may include but is not limited to a vision sensor, a millimeter-wave radar, a lidar, and a wheel speed sensor.
[0107] S420: Trigger intelligent driving when the detection information is normal.
[0108] Specifically, when the detection information obtained by the sensor module 210 is normal, the sensor module 210 transmits the detection information to the fusion control module 220, and the fusion control module 220 processes the detection information to determine to trigger intelligent driving control. It should be noted that the detection information being normal includes that the acceleration of the target vehicle obtained from the detection information is greater than a first threshold, that is, the target vehicle does not decelerate sharply. The first threshold is a negative value, and is used to determine whether the target vehicle performs a sharp deceleration operation. The first threshold is set by development personnel based on an actual situation. This is not limited in this application.
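The normality check described above can be sketched as follows. The concrete threshold value here is a hypothetical calibration, since the application leaves the first threshold to development personnel:

```python
# Illustrative sketch of the S420 normality check. The first threshold is a
# negative value; an acceleration below it indicates a sharp deceleration by
# the target vehicle. FIRST_THRESHOLD is an assumed figure, not from the
# application.
FIRST_THRESHOLD = -4.0  # m/s^2, hypothetical calibration value

def detection_is_normal(target_accel_mps2: float) -> bool:
    """Return True when the target vehicle is not decelerating sharply."""
    return target_accel_mps2 > FIRST_THRESHOLD

# A vehicle cruising at constant speed (acceleration 0) is normal and allows
# intelligent driving to be triggered; a hard brake at -6 m/s^2 instead
# triggers the collision analysis of S430.
```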
[0109] In addition, the intelligent driving includes but is not limited to an ACC system. The ACC system enables the vehicle to travel at a speed set by a driver, and may further maintain a preset vehicle following distance and automatically accelerate and decelerate as the distance changes.
[0110] For example, in a single-lane scenario, the sensor module 210 obtains detection information of a vehicle ahead (vehicle B) after an ego vehicle (vehicle A) is started. The detection information includes that a speed of the vehicle ahead (vehicle B) is 40 km/h, an acceleration is 0, and a relative distance between the vehicle ahead (vehicle B) and the ego vehicle (vehicle A) is 300 m. The sensor module 210 transmits the obtained detection information to the fusion control module 220. The fusion control module 220 analyzes the received detection information, and finds that the vehicle ahead (vehicle B) is traveling at a constant speed and the detection information is normal. In this case, the fusion control module 220 prompts a driver to trigger a full-speed ACC system, and determines that a vehicle following speed is 50 km/h. If a preset vehicle following distance is 150 m, after the driver triggers the full-speed ACC system, the ego vehicle (vehicle A) automatically accelerates to 50 km/h and keeps a vehicle following distance of 150 m from the vehicle ahead (vehicle B).
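The single-lane example above can be checked with simple arithmetic. The values are taken from the example; the closing-speed computation is an illustration, not part of the claims:

```python
# Values from the single-lane ACC example.
lead_speed_kmh = 40.0    # vehicle B, traveling at a constant speed
follow_speed_kmh = 50.0  # vehicle following speed determined for vehicle A
initial_gap_m = 300.0    # initial relative distance
preset_gap_m = 150.0     # preset vehicle following distance

# While vehicle A travels 10 km/h faster than vehicle B, the gap shrinks at
# the closing speed below until the preset following distance is reached.
closing_speed_mps = (follow_speed_kmh - lead_speed_kmh) / 3.6
time_to_preset_gap_s = (initial_gap_m - preset_gap_m) / closing_speed_mps
print(round(time_to_preset_gap_s))  # 54 seconds until the 150 m gap is reached
```

After those roughly 54 seconds, the full-speed ACC system would match vehicle B's speed to hold the 150 m following distance.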
[0111] S430: When the detection information includes that the acceleration of the target vehicle is less than a first threshold, perform collision analysis, determine a collision risk level, and make a control decision.
[0112] Specifically, when the detection information obtained by the sensor module 210 includes that the acceleration of the target vehicle is less than the first threshold, that is, when the detection information indicates that the target vehicle performs a sharp deceleration operation, the sensor module 210 transmits the detection information to the fusion control module 220, and simultaneously triggers the fusion control module 220 to perform collision analysis, so as to obtain a collision analysis result, determine the collision risk level based on the collision analysis result, and make the control decision based on the collision risk level. The collision risk level includes a low risk, a medium risk, and a high risk, and the control decision includes a low-risk decision, a medium-risk decision, and a high-risk decision.
[0113] Methods for a vehicle to perform collision analysis include but are not limited to the following methods.
(1) Vehicle Kinematics Method
[0114] The sensor module 210 obtains detection information of a target vehicle and related information, such as a vehicle posture, of an ego vehicle, and transmits the information to the fusion control module 220. The fusion control module 220 calculates a collision moment based on the received information. The detection information of the target vehicle includes a speed and an acceleration of the target vehicle and a relative distance between the target vehicle and the ego vehicle, and the related information of the ego vehicle includes a speed and an acceleration of the ego vehicle. A kinematic model of the vehicle may include but is not limited to a constant velocity (CV) model, a constant acceleration (CA) model, a constant turn rate and velocity (CTRV) model, a constant turn rate and acceleration (CTRA) model, a constant steering angle and velocity (CSAV) model, and a constant curvature and acceleration (CCA) model. The corresponding collision moment may be obtained by using these vehicle kinematics models based on the obtained information of the target vehicle and the ego vehicle.
[0115] A collision risk determining manner may be described as follows: The collision moment is compared with a preset moment. When a ratio of the collision moment to the preset moment is less than or equal to a first relative ratio, the collision risk level is a high risk; when the ratio is greater than the first relative ratio and is not greater than a second relative ratio, the collision risk level is a medium risk; or when the ratio is greater than the second relative ratio, the collision risk level is a low risk. A smaller ratio indicates that the possible collision is closer in time, and therefore indicates a higher risk.
[0116] It should be noted that the preset moment is determined based on historical data analysis and a situation of the actuator of the ego vehicle.
[0117] It may be understood that the first relative ratio and the second relative ratio are set by research and development personnel based on an actual situation. This is not limited in this application.
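The vehicle kinematics method can be sketched under the simplest of the listed models, the constant velocity (CV) model, together with the ratio-based classification. The threshold ratios and the preset moment below are hypothetical calibration values, and a smaller ratio is mapped to a higher risk, matching the convention used by the other analysis methods in this description.

```python
import math

# Sketch of the vehicle kinematics method under a CV model, plus the
# ratio-based risk classification; ratios and preset moment are assumed.

def time_to_collision_cv(gap_m, ego_speed, target_speed):
    """Time until the relative distance closes under the CV model.
    Returns math.inf when the ego vehicle is not closing on the target."""
    closing_speed = ego_speed - target_speed  # m/s
    if closing_speed <= 0:
        return math.inf
    return gap_m / closing_speed

def risk_level(collision_moment, preset_moment,
               first_ratio=0.5, second_ratio=1.0):
    """A smaller ratio of collision moment to preset moment means an
    earlier possible collision and therefore a higher risk."""
    ratio = collision_moment / preset_moment
    if ratio <= first_ratio:
        return "high"
    if ratio <= second_ratio:
        return "medium"
    return "low"
```

The CA, CTRV, CTRA, CSAV, and CCA models would replace `time_to_collision_cv` with their own motion prediction while keeping the same classification step.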
(2) Direct Image Detection Method
[0118] A vehicle may obtain target information in an ambient environment by using the sensor module 210, and in particular, obtain an image and/or a video of the ambient environment by using the vision sensor in the sensor module 210. The sensor module 210 detects the ambient environment, obtains detection information of a target vehicle, and performs positioning and category classification on a vehicle target that appears in the image, to obtain a classification result and a minimum distance to a bounding box of the target vehicle. To this end, the sensor module 210 may include an image detection system and a neural network system. Then, the corresponding detection information is transmitted to the fusion control module 220, and the fusion control module 220 may perform collision risk analysis in combination with the information of the ego vehicle.
[0119] A collision risk determining manner may be described as follows: The minimum distance between bounding boxes of the vehicles is compared with a safety distance. When a ratio of the minimum distance between the bounding boxes of the vehicles to the safety distance is less than or equal to a third relative ratio, the collision risk level is a high risk; when the ratio is greater than the third relative ratio and is not greater than a fourth relative ratio, the collision risk level is a medium risk; or when the ratio is greater than the fourth relative ratio, the collision risk level is a low risk.
[0120] It may be understood that the third relative ratio and the fourth relative ratio are set by research and development personnel based on an actual situation. This is not limited in this application.
[0121] It should be noted that the safety distance is determined based on historical data analysis and a situation of the actuator of the ego vehicle. A key to performing collision analysis by using the direct image detection method is to obtain a clear multi-directional image. Therefore, the vision sensor plays a very important role in the method. In this case, the sensor module 210 includes but is not limited to a vision sensor, an infrared sensor, and the like.
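The distance-and-ratio step of the direct image detection method can be sketched as follows. The axis-aligned box format, the threshold ratios, and the safety distance are illustrative assumptions; the detection and classification step that produces the boxes is the neural network system's job and is not shown.

```python
import math

# Sketch: minimum distance between two axis-aligned bounding boxes,
# then the ratio-based classification described above.

def min_box_distance(box_a, box_b):
    """Minimum distance between two axis-aligned boxes given as
    (x_min, y_min, x_max, y_max); 0 when the boxes overlap."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    dx = max(bx1 - ax2, ax1 - bx2, 0.0)  # horizontal gap, 0 if overlapping
    dy = max(by1 - ay2, ay1 - by2, 0.0)  # vertical gap, 0 if overlapping
    return math.hypot(dx, dy)

def risk_from_distance(min_distance, safety_distance,
                       third_ratio=0.3, fourth_ratio=0.7):
    """The smaller the bounding-box distance relative to the safety
    distance, the higher the collision risk."""
    ratio = min_distance / safety_distance
    if ratio <= third_ratio:
        return "high"
    if ratio <= fourth_ratio:
        return "medium"
    return "low"
```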
(3) Trajectory Intersection Method
[0122] The sensor module 210 obtains detection information in real time and transmits the detection information to the fusion control module 220. The detection information includes current locations and motion status information of an ego vehicle and a target vehicle. The fusion control module 220 predicts traveling tracks of the ego vehicle and the target vehicle, and determines whether the two tracks intersect. If the two tracks intersect, the fusion control module 220 calculates a time period in which the ego vehicle arrives at an intersection point of the two tracks from the current location and a time period in which the target vehicle arrives at the intersection point of the two tracks from the current location, and calculates an absolute value of a difference between the two time periods.
[0123] A collision risk determining manner may be described as follows: The absolute value of the time difference is compared with a preset moment. When a ratio of the absolute value of the time difference to the preset moment is less than or equal to a fifth relative ratio, the collision risk level is a high risk; when the ratio is greater than the fifth relative ratio and is not greater than a sixth relative ratio, the collision risk level is a medium risk; or when the ratio is greater than the sixth relative ratio, the collision risk level is a low risk.
[0124] For example,
[0125] It may be understood that the fifth relative ratio and the sixth relative ratio are set by research and development personnel based on an actual situation. This is not limited in this application.
[0126] It should be noted that the preset moment is determined based on historical data analysis and a situation of the actuator of the ego vehicle.
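The trajectory intersection method can be sketched under the simplifying assumption that both tracks are predicted as straight lines at constant velocity; the geometry, parameter names, and numerical tolerance below are illustrative, not the actual prediction model.

```python
# Sketch of the trajectory intersection method under straight-line
# track prediction (an illustrative assumption).

def intersection_time_gap(ego_pos, ego_vel, target_pos, target_vel):
    """Predict straight-line tracks for both vehicles, find the
    intersection point of the two tracks, and return the absolute
    difference between the times each vehicle needs to reach it from
    its current location. Returns None when the tracks are parallel or
    the intersection point lies behind either vehicle."""
    (ax, ay), (avx, avy) = ego_pos, ego_vel
    (bx, by), (bvx, bvy) = target_pos, target_vel
    # Solve ego_pos + t * ego_vel == target_pos + s * target_vel for t, s.
    det = -avx * bvy + bvx * avy
    if abs(det) < 1e-9:
        return None  # parallel tracks: no intersection point
    rx, ry = bx - ax, by - ay
    t = (-rx * bvy + bvx * ry) / det  # ego travel time to the intersection
    s = (avx * ry - avy * rx) / det   # target travel time to the intersection
    if t < 0 or s < 0:
        return None  # intersection point is not ahead of both vehicles
    return abs(t - s)
```

The returned time difference is then compared with the preset moment as described above, with a smaller ratio indicating a higher collision risk.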
[0127] It should be noted that the foregoing three collision analysis methods are merely examples, and many other collision analysis methods are not described herein. The foregoing content should not be understood as a limitation on this application. In addition, during collision analysis, one collision analysis method may be used, or analysis may be performed with reference to a plurality of collision analysis methods, to determine a collision risk level based on the analysis results of the plurality of collision analysis methods and make a corresponding control decision. It may be understood that, if the collision risk levels determined based on the analysis results of the plurality of collision analysis methods are different, the collision risk level representing the maximum risk degree is selected as the collision risk level of this collision analysis. For example, collision analysis is performed in combination with the foregoing three collision analysis methods, the collision risk level obtained by using the vehicle kinematics method is a high risk, the collision risk level obtained by using the direct image detection method is a medium risk, and the collision risk level obtained by using the trajectory intersection method is a low risk. Among the analysis results of the three collision analysis methods, the high risk represents the maximum risk degree, so the collision risk level of this collision analysis is determined as the high risk.
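The maximum-risk selection rule above can be sketched in a few lines; the level names and ordering are taken from this description.

```python
# Sketch of selecting the maximum risk degree when several collision
# analysis methods disagree.
RISK_ORDER = {"low": 0, "medium": 1, "high": 2}

def fuse_risk_levels(levels):
    """Return the level representing the maximum risk degree."""
    return max(levels, key=RISK_ORDER.__getitem__)
```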
[0128] S440: When it is determined that there is a collision risk, transmit the control decision.
[0129] Specifically, after collision analysis is performed, the collision risk level is determined, and the control decision is made, the fusion control module 220 transmits the control decision to the reminding module 230.
[0130] It can be learned from the foregoing content that collision analysis is triggered only when the detection information is abnormal, for example, when the target vehicle decelerates sharply. However, abnormal detection information does not always mean that a collision will occur. For example, the target vehicle may decelerate sharply, then accelerate sharply, and then reach a stable speed. This situation may not affect safe driving, and in this case a result of collision analysis may be that there is no risk. When the result of collision analysis is that there is no risk, the collision risk level is not classified, and the control decision is not made.
[0131] S450: Remind the target vehicle.
[0132] Specifically, the reminding module 230 reminds the target vehicle based on the received control decision. The reminding module 230 may remind the target vehicle by controlling a sound and/or light, or may remind the target vehicle by sending a V2X message to the target vehicle.
[0133] Optionally, in a manner of reminding the target vehicle by controlling a sound and/or light, the sound may be a specific sound, and the light may be a specific light combination, for example, light flashing at a specific frequency or a specific light hardware composition. Alternatively, a combination of sounds, such as short whistles, and light may be used. It may be understood that, in a scenario in which use of sound is restricted, sound is used less and light is preferred.
[0134] Similarly, a V2X message may be directly sent to the target vehicle or a V2X message may be sent to the target vehicle by using an RSU to remind the target vehicle. The V2X message may include but is not limited to a collision analysis method, a collision risk level, and a moment at which a collision may occur. Optionally, the V2X message may be displayed on a V2X-related screen of the target vehicle, or may be directly played by using a speaker after successful transmission, or a combination of the two methods may be used to remind the target vehicle. For example, after the collision risk level is determined, the collision risk level is transmitted to the target vehicle, and the collision risk level is broadcast through an in-vehicle speaker.
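The V2X message content listed above can be sketched as a simple payload. The field names and structure here are hypothetical illustrations; an actual deployment would encode the reminder in a standardized V2X message format rather than a plain dictionary.

```python
# Hypothetical V2X reminder payload mirroring the message content listed
# above: the analysis method, the risk level, and the moment at which a
# collision may occur.

def build_v2x_reminder(analysis_method, risk_level, collision_time_s):
    return {
        "analysis_method": analysis_method,    # e.g. "vehicle kinematics"
        "risk_level": risk_level,              # "low" / "medium" / "high"
        "collision_time_s": collision_time_s,  # moment a collision may occur
    }
```

Such a payload could be sent directly to the target vehicle or relayed through an RSU, and then shown on the target vehicle's V2X-related screen and/or played through its speaker.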
[0135] Different reminding methods can be used for different collision risks. For example, in a reminding method of using light, light flashing at a higher frequency is used for reminding in a scenario with a high collision risk, and light flashing at a lower frequency is used for reminding in a scenario with a low collision risk.
[0136] Similarly, in a reminding method of using a V2X message, content of the V2X messages transmitted in different collision risk scenarios is different. Therefore, content displayed on a V2X-related screen and/or content played by a speaker is also different.
[0137] In addition, the target vehicle takes a corresponding measure after receiving a reminder from the ego vehicle. When the information received by the target vehicle indicates that the collision risk level is a high risk, the target vehicle performs braking processing and sends a command for ensuring safety of the driver of the target vehicle. The command includes but is not limited to fastening a seat belt and adjusting a head restraint.
[0138] It should be noted that, in an entire vehicle reminding process, the sensor module 210 of the ego vehicle continuously obtains detection information to determine an ambient environment and a status of the target vehicle. Therefore, after the target vehicle is reminded, if information obtained by the sensor module 210 indicates that the risk is eliminated, the reminding module 230 of the ego vehicle sends information that the risk is eliminated to the target vehicle.
[0139] The method in embodiments of this application is described in detail above. For ease of better implementing the solutions in embodiments of this application, correspondingly, a related device used to cooperate in implementing the solutions is further provided below.
[0140] As shown in
[0141] The vehicle reminding device 500 includes a receiving unit 510, a processing unit 520, and a control unit 530.
[0142] The receiving unit 510 is configured to receive detection information, where the detection information includes related information of a target vehicle and related information of an ego vehicle that are collected by a sensor, and transmit the received detection information to the processing unit 520.
[0143] The processing unit 520 is configured to process the detection information sent by the receiving unit 510, perform collision analysis based on the detection information, make a control decision, and send the control decision to the control unit 530.
[0144] The control unit 530 is configured to remind the target vehicle based on the control decision made by the processing unit 520.
[0145] The three units may transmit data to each other by using a communication path. It should be understood that the units included in the vehicle reminding device 500 may be software units, or may be hardware units, or some units are software units and some units are hardware units.
[0146] In addition, the receiving unit 510 of the vehicle reminding device 500 actually performs operations performed by the sensor module 210, the processing unit 520 actually performs operations performed by the fusion control module 220, and the control unit 530 actually performs operations performed by the reminding module 230.
[0147]
[0148] The computing device 600 may be the vehicle reminding device in
[0149] The processor 610 may include one or more general-purpose processors, for example, a central processing unit (CPU), or a combination of a CPU and a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or a combination thereof. The PLD may be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), generic array logic (GAL), or any combination thereof.
[0150] The bus 640 may be a peripheral component interconnect (PCI) bus, an extended industry standard architecture (EISA) bus, or the like. The bus 640 may be classified into an address bus, a data bus, a control bus, and the like. For ease of representation, only one thick line is used to represent the bus in
[0151] The memory 630 may include a volatile memory such as a random access memory (RAM); or the memory 630 may include a non-volatile memory such as a read-only memory (ROM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); or the memory 630 may include a combination of the foregoing types of memories. Program code may be used to implement a functional unit shown in the vehicle reminding device 500, or to implement method operations in which the vehicle reminding device is used as an execution body in the method embodiment shown in
[0152] An embodiment of this application further provides a computer-readable storage medium. The computer-readable storage medium stores a computer program. When the program is executed by a processor, some or all of the operations recorded in any one of the foregoing method embodiments can be implemented, and functions of any functional unit described in
[0153] An embodiment of this application further provides a computer program product. When the computer program product is run on a computer or a processor, the computer or the processor is enabled to perform one or more operations in any one of the foregoing methods. When the foregoing modules in the device are implemented in a form of a software functional unit and sold or used as an independent product, the modules may be stored in the computer-readable storage medium.
[0154] This application further provides an intelligent vehicle. The intelligent vehicle includes the computing device shown in
[0155] In the foregoing embodiments, the description of each embodiment has respective focuses. For a part that is not described in detail in an embodiment, refer to related descriptions in other embodiments.
[0156] It should further be understood that “first”, “second”, “third”, “fourth”, and various numbers in this specification are merely used for differentiation for ease of description, and are not construed as any limitation on the scope of this application.
[0157] It should be understood that the term “and/or” in this specification describes only an association relationship between associated objects and represents that three relationships may exist. For example, A and/or B may represent the following three cases: Only A exists, both A and B exist, and only B exists. In addition, the character “/” in this specification generally indicates an “or” relationship between the associated objects.
[0158] It should be understood that sequence numbers of the foregoing processes do not mean execution sequences in various embodiments of this application. The execution sequences of the processes should be determined according to functions and internal logic of the processes, and should not be construed as any limitation on the implementation processes of embodiments of this application.
[0159] A person of ordinary skill in the art may be aware that, in combination with the examples described in embodiments disclosed in this specification, units and algorithm steps can be implemented by electronic hardware or a combination of computer software and electronic hardware. Whether the functions are performed by hardware or software depends on particular applications and design constraints of the technical solutions. A person skilled in the art may use different methods to implement the described functions for each particular application, but it should not be considered that the implementation goes beyond the scope of this application.
[0160] It may be clearly understood by a person skilled in the art that, for the purpose of convenient and brief description, for a detailed working process of the foregoing system, apparatus, and unit, refer to a corresponding process in the foregoing method embodiments, and details are not described herein again.
[0161] In the several embodiments provided in this application, it should be understood that the disclosed system, apparatus, and method may be implemented in other manners. For example, the described apparatus embodiment is merely an example. For example, division into units is merely logical function division and may be other division in an actual implementation. For example, a plurality of units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the displayed or discussed mutual couplings or direct couplings or communication connections may be implemented through some interfaces. The indirect couplings or communication connections between the apparatuses or units may be implemented in electrical, mechanical, or other forms.
[0162] The units described as separate components may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected based on actual requirements to achieve the objectives of the solutions of embodiments.
[0163] In addition, function units in embodiments of this application may be integrated into one processing unit, or each of the units may exist alone physically, or two or more units may be integrated into one unit.
[0164] When the functions are implemented in a form of a software functional unit and sold or used as an independent product, the functions may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of this application essentially, or the part contributing to the conventional technology, or some of the technical solutions may be implemented in a form of a software product. The computer software product is stored in a storage medium, and includes several instructions for instructing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the operations of the methods described in embodiments of this application. The foregoing storage medium includes: any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
[0165] A sequence of the operations of the method in embodiments of this application may be adjusted, combined, or removed based on an actual requirement.
[0166] The modules in the apparatus in embodiments of this application may be combined, divided, and deleted based on an actual requirement.
[0167] In conclusion, the foregoing embodiments are merely intended for describing the technical solutions of this application, but not for limiting this application. Although this application is described in detail with reference to the foregoing embodiments, persons of ordinary skill in the art should understand that they may still make modifications to the technical solutions described in the foregoing embodiments or make equivalent replacements to some technical features thereof, without departing from the scope of the technical solutions of embodiments of this application.