METHOD AND SYSTEM FOR CONTEXT AND CONTENT AWARE SENSOR IN A VEHICLE
20200174474 · 2020-06-04
Inventors
CPC classification
B60W2050/065
PERFORMING OPERATIONS; TRANSPORTING
G06V20/56
PHYSICS
B60W50/06
PERFORMING OPERATIONS; TRANSPORTING
G05D1/0088
PHYSICS
G01S2013/9316
PHYSICS
G01S2013/9322
PHYSICS
B60W2554/00
PERFORMING OPERATIONS; TRANSPORTING
B60W2420/403
PERFORMING OPERATIONS; TRANSPORTING
B60W60/001
PERFORMING OPERATIONS; TRANSPORTING
International classification
G05D1/00
PHYSICS
Abstract
A method for sampling of task relevant data in sensors in a vehicle, wherein a number of sensors are arranged in the vehicle. The method comprises receiving a task at a computer or data processing unit arranged in the vehicle, the task being associated with task information, providing a set of abstract models associated with context and content information, the abstract models including models describing traffic scenes and the context and content information describing traffic scene environment, classifying sampling of sensor data according to the task information and a selected abstract model in order to sample task relevant data, evaluating the selected abstract model based on received sensor data whether to maintain the selected abstract model or to select a new abstract model, the received sensor data representing an actual traffic scene, and adapting the classification of sampling of sensor data based on the selected abstract model.
Claims
1. A method for sampling of task relevant data in sensors in a vehicle, a number of sensors being arranged in the vehicle, the method comprising: providing a task to a data processing unit arranged in the vehicle, the task being associated with task information; providing a set of abstract models associated with context and content information, the abstract models including models describing traffic scenes, and the context and content information describing traffic scene environment; classifying sampling points of sensor data according to the task information and a selected abstract model in order to sample task relevant data; evaluating the selected abstract model based on received sensor data whether to one of maintain the selected abstract model and to select a new abstract model, the received sensor data representing an actual traffic scene; and adapting the classification of sampling of sensor data based on the selected abstract model.
2. The method according to claim 1, further comprising: continuously collecting task relevant data based on received sensor data; adding the task relevant data to the task information; and providing task information to an electronic control unit of the vehicle.
3. The method according to claim 1, further comprising updating the context and content information of the abstract models with data from task relevant data.
4. The method according to claim 1, wherein the sensors include at least one of: at least one camera; at least one LIDAR; and at least one radar unit.
5. A method for sampling of task relevant data in sensors in a vehicle, a number of sensors being arranged in the vehicle, the method comprising: providing a task to a sensor data processing unit arranged in a sensor of the vehicle, the task being associated with task information; providing a set of abstract models associated with context and content information, the abstract models including models describing traffic scenes and the context and content information describing traffic scene environment; classifying sampling points of sensor data according to the task information and a selected abstract model in order to sample task relevant data; evaluating the selected abstract model based on sensor data sensed in the sensor whether to one of maintain the selected abstract model and to select a new abstract model, the sensor data representing an actual traffic scene; and adapting the classification of sampling of sensor data in the sensor based on the selected abstract model.
6. The method according to claim 5, further comprising: continuously collecting task relevant data based on received sensor data; adding the task relevant data to the task information; and providing task information to an electronic control unit of the vehicle.
7. The method according to claim 5, further comprising updating the context and content information of the abstract models with data from task relevant data.
8. The method according to claim 5, wherein the sensors include at least one of: at least one camera; at least one LIDAR; and at least one radar unit.
9. A system for sampling of task relevant data in sensors in a vehicle, a number of sensors being arranged in the vehicle, the system comprising: a data processing unit arranged in the vehicle, the data processing unit being configured to receive tasks, each task being associated with task information, the data processing unit being configured to: classify sampling points of sensor data according to task information and a selected abstract model in order to sample task relevant data, an abstract model being selected from a set of abstract models associated with context and content information, the abstract models including models describing traffic scenes, and the context and content information describing traffic scene environment; evaluate the selected abstract model based on received sensor data whether to one of maintain the selected abstract model and to select a new abstract model, the received sensor data representing an actual traffic scene; and adapt the classification of sampling of sensor data based on the selected abstract model.
10. The system according to claim 9, wherein the data processing unit is further configured to: continuously collect task relevant data based on received sensor data; add the task relevant data to the task information; and provide task information to an electronic control unit of the vehicle.
11. The system according to claim 9, wherein the data processing unit is further configured to update the context and content information of the abstract models with data from task relevant data.
12. The system according to claim 9, wherein the sensors include at least one of: at least one camera; at least one LIDAR; and at least one radar unit.
13. A system for sampling of task relevant data in sensors in a vehicle, a number of sensors being arranged in the vehicle, at least one sensor including: a data processing unit, the data processing unit being configured to: receive tasks, each task being associated with task information; classify sampling points of sensor data according to task information and a selected abstract model in order to sample task relevant data, an abstract model being selected from a set of abstract models associated with context and content information, the abstract models including models describing traffic scenes, and the context and content information describing traffic scene environment; evaluate the selected abstract model based on sensor data whether to one of maintain the selected abstract model and to select a new abstract model, the sensor data representing an actual traffic scene; and adapt the classification of sampling of sensor data based on the selected abstract model.
14. The system according to claim 13, wherein the sensor data processing unit is further configured to: continuously collect task relevant data based on received sensor data; add the task relevant data to the task information; and provide task information to an electronic control unit of the vehicle.
15. The system according to claim 13, wherein the sensor data processing unit is further configured to update the context and content information of the abstract models with data from task relevant data.
16. The system according to claim 13, wherein the sensors include at least one of: at least one camera; at least one LIDAR; and at least one radar unit.
17. A sensor for sampling of task relevant data in a vehicle, the sensor including: a data processing unit, the data processing unit being configured to: receive tasks, each task being associated with task information; classify sampling points of sensor data according to task information and a selected abstract model in order to sample task relevant data, an abstract model being selected from a set of abstract models associated with context and content information, the abstract models including models describing traffic scenes, and the context and content information describing traffic scene environment; evaluate the selected abstract model based on sensor data whether to one of maintain the selected abstract model and to select a new abstract model, the sensor data representing an actual traffic scene; and adapt the classification of sampling of sensor data based on the selected abstract model.
18. The sensor according to claim 17, wherein the sensor data processing unit is further configured to: continuously collect task relevant data based on received sensor data; add the task relevant data to the task information; and provide task information to an electronic control unit of the vehicle.
19. The sensor according to claim 17, wherein the sensor data processing unit is further configured to update the context and content information of the abstract models with data from task relevant data.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0028] Exemplifying embodiments of the invention will be described below with reference to the accompanying drawings, in which:
DESCRIPTION OF EXEMPLIFYING EMBODIMENTS
[0037] The following is a description of exemplifying embodiments in accordance with the present invention. This description is not to be taken in a limiting sense, but is made merely for the purpose of describing the general principles of the invention.
[0038] Thus, preferred embodiments of the present invention will now be described for the purpose of exemplification with reference to the accompanying drawings, wherein like numerals indicate the same elements throughout the views. It should be understood that the present invention encompasses other exemplary embodiments that comprise combinations of features as described in the following. Additionally, other exemplary embodiments of the present invention are defined in the appended claims.
[0039] First, the ADAS system will be described in general terms. An ADAS system may include one or more of the following elements of a vehicle: an adaptive cruise control (ACC); an adaptive high beam system; an adaptive light control system; an automatic parking system; an automotive night vision system; a blind spot monitor; a collision avoidance system (sometimes called a pre-collision system or a PCS); a crosswind stabilization system; a driver drowsiness detection system; a driver monitoring system; an emergency driver assistance system; a forward collision warning system; an intelligent speed adaptation system; a lane departure warning system (sometimes called a lane keep assist system or LKA); a pedestrian protection system; a traffic sign recognition system; a turning assistant; and a wrong-way driving warning system. The ADAS system may also include any software or hardware included in the vehicle that makes the vehicle an autonomous vehicle or semi-autonomous vehicle.
[0040] Referring now to
[0041] These elements may be communicatively connected or coupled to a network 9. Although only one vehicle 5 and one server or cloud based solution and one network 9 are shown in
[0042] The vehicle 5 may include a car, a truck, a sports utility vehicle (SUV), a bus, a semi-truck, a drone, or any other roadway-based conveyance. A roadway-based conveyance is a hardware device that traverses the top surface of a roadway.
[0043] In embodiments, the vehicle 5 may include an autonomous vehicle, a semi-autonomous vehicle or a Highly Automated Vehicle (HAV). For example, the vehicle 5 may include an ADAS system 11 which is operable to make the vehicle 5 an autonomous vehicle. An HAV is a vehicle 5 whose ADAS system 11 operates at level 3 or higher as defined by the NHTSA in the policy paper Federal Automated Vehicles Policy: Accelerating the Next Revolution in Roadway Safety, published in September 2016.
[0044] The vehicle 5 may further include a sensor set 8 comprising camera sensors 12a and 12b, one or more LIDAR sensors 14a and 14b, and one or more radar sensors 16a and 16b. In the present example, two camera sensors, two LIDAR sensors, and two radar sensors are shown. However, any number of sensors may be included; as an example, a vehicle may often have 10-15 camera sensors. It should also be understood that the locations of the sensors in the figures are schematic, i.e. for illustrative purposes only; the sensors may be located at any suitable location in the vehicle 5 in order to satisfy their functions and purposes.
[0045] Furthermore, the vehicle 5 may include an actuator set 18, a hardware ECU 20, a communication unit 26, and a GPS unit 28. The ECU 20 in turn includes a processor 22 and a memory 24. These elements of the vehicle 5 are communicatively coupled to one another via a bus 30. The memory 24 may be a non-transitory computer readable memory. The memory 24 stores instructions or data that can be executed by the processor 22. The instructions or data contain code for performing the techniques described herein. In some embodiments the memory 24 may be a dynamic random access memory (DRAM) device, a static random access memory (SRAM) device, flash memory, or some other memory device. In some embodiments, the memory 24 also includes a non-volatile memory or similar permanent storage device and media including a hard disk drive, a floppy disk drive, a CD-ROM device, a DVD-ROM device, a DVD-RAM device, a DVD-RW device, a flash memory device or some other mass storage device for storing information on a more permanent basis.
[0046] The actuator set 18 includes one or more actuators of the vehicle 5. For example, the actuator set 18 includes one or more of the following: one or more hydraulic actuators, one or more electric actuators and one or more thermal actuators.
[0047] The processor 22 may include an arithmetic logic unit, a microprocessor, a general purpose controller or some other processor array to perform computations and provide electronic display signals to a display device. The processor 22 processes data signals and may include various computing architectures including a complex instruction set computer (CISC) architecture, a reduced instruction set computer (RISC) architecture or an architecture implementing a combination of instruction sets. Although
[0048] The sensor set 8 may be operable to record data (hereinafter sensor data) that describes one or more measurements of the sensors included in the sensor set 8.
[0049] As indicated above, the sensor set 8 includes sensors that are operable to measure the physical environment outside the vehicle 5. For example, the sensor set 8 may record one or more physical characteristics of the physical environment that is proximate to the vehicle 5. The measurements recorded by the sensors are described by sensor data stored in the memory 24.
[0050] The sensor data may describe the physical environment proximate to the vehicle at one or more times. The sensor data may be timestamped by the sensors of the sensor set 8.
[0051] The sensor set 8 includes, as described above, sensors 12a, 12b, 14a, 14b, 16a, and 16b that are operational to measure, among other things: the physical environment or roadway environment where the vehicle 5 is located; the static objects within this physical environment; the dynamic objects within the physical environment and the behaviour of these objects; the position of the vehicle 5 relative to static and dynamic objects within the physical environment (e.g. as recorded by one or more range-finding sensors such as LIDAR); the weather within the physical environment over time and other natural phenomena within the physical environment over time; and coefficients of friction and other variables describing objects.
[0052] The communication unit 26 transmits and receives data to and from the network 9 or to another communication channel. In some embodiments, the communication unit 26 may include a DSRC transceiver, a DSRC receiver and the hardware or software necessary to make the vehicle 5 a DSRC-enabled device.
[0053] In embodiments of the present invention, the communication unit 26 includes a port for direct physical connection to the network 9 or to another communication channel. For example, the communication unit 26 includes a USB, SD, CAT-5, or similar port for wired communication with the network 9. In some embodiments, the communication unit 26 includes a wireless transceiver for exchanging data with the network 9 or other communication channel using one or more wireless communication methods, including IEEE 802.11; IEEE 802.16; BLUETOOTH; EN ISO 14906:2004 Electronic Fee Collection - Application interface; EN 11253:2004 Dedicated Short-Range Communication - Physical layer using microwave at 5.8 GHz (review); EN 12795:2002 Dedicated Short-Range Communication (DSRC) - DSRC data link layer: Medium Access and Logical Link Control (review); EN 12834:2002 Dedicated Short-Range Communication - Application layer (review); and EN 13372:2004 Dedicated Short-Range Communication (DSRC) - DSRC profiles for RTTT applications (review).
[0054] In some embodiments, the communication unit 26 includes a cellular communication transceiver for sending and receiving data over a cellular communications network including via short messaging service (SMS), multimedia messaging service (MMS), hypertext transfer protocol (HTTP or HTTPS if the secured implementation of HTTP is used), direct data connection, WAP, email or any other suitable type of electronic communication. In some embodiments, the communication unit 26 also provides other conventional connections to the network 9 for distribution of files or media using standard network protocols including TCP/IP, HTTP, HTTPS and SMTP, millimeter wave, DSRC, etc.
[0055] The vehicle further includes one or more driving systems 23, such as a braking system and a steering system 23.
[0056] A data processing unit 40 is arranged in the vehicle 5 and may be communicatively coupled, via the bus 30, to the sensors 8 and the electronic control unit 20, as well as to other units of the vehicle, such as the GPS unit 28 and the communication unit 26.
[0057] The data processing unit 40 may include a processor 43, such as an arithmetic logic unit, a microprocessor, a general purpose controller or some other processor array to perform computations and provide electronic display signals to a display device. The processor 43 may include a sensor fusion processor for merging sensor data from different sensors.
[0058] The data processing unit 40 processes data signals and may include various computing architectures including a complex instruction set computer (CISC) architecture, a reduced instruction set computer (RISC) architecture or an architecture implementing a combination of instruction sets. The data processing unit 40 may further include a memory 42. The memory 42 may be a non-transitory computer readable memory. The memory 42 stores instructions or data that can be executed by the data processing unit 40. The instructions or data contain code for performing the techniques described herein. In some embodiments the memory 42 may be a dynamic random access memory (DRAM) device, a static random access memory (SRAM) device, flash memory, or some other memory device. In some embodiments, the memory 42 also includes a non-volatile memory or similar permanent storage device and media including a hard disk drive, a floppy disk drive, a CD-ROM device, a DVD-ROM device, a DVD-RAM device, a DVD-RW device, a flash memory device or some other mass storage device for storing information on a more permanent basis.
[0059] The data processing unit 40 may be configured to receive tasks, each task being associated with task information. A task may for example be navigation to a certain location, area or place, e.g. defined by map coordinates or GPS coordinates. The task information may be information related to the task and needed to solve the task; for example, it may be coordinates, or information related to available roads or barriers along the road. Task relevant data is data from sensors that is relevant to an ongoing task and may be added to the task information. For example, it may be information that describes the roadway environment outside the vehicle 5, the location of remote vehicles, objects in the roadway or in the surroundings of the roadway relative to the vehicle 5, the operational status of the vehicle 5 (e.g. kinematic data for the vehicle 5), map data and GPS data.
[0060] In some embodiments, the operational status of the vehicle 5 is information in the sensor data that describes one or more of the following: kinematic data for the vehicle, the latitude and longitude of the vehicle, the heading of the vehicle, braking system data (e.g. whether brakes are engaged), the elevation of the vehicle, the current time for the vehicle, the speed of the vehicle, the steering angle of the vehicle, the acceleration of the vehicle, the path history of the vehicle, and an estimate of the future path of the vehicle.
[0061] The GPS data may include digital data that describes the geographic location of the vehicle, for example, latitude and longitude with lane-level accuracy.
[0062] The map data may include digital data that describes, for different combinations of latitude and longitude as indicated by GPS data, different geographical features of the roadway environment indicated by the GPS data such as the presence of curves in the roadway, whether this is a bumpy road, the average vehicular speeds along the roadway at different times of day etc.
[0063] An abstract model is a simplified model or description of a scene or environment based on vectors or surfaces. The abstract model may contain information about the scene, i.e. context information; in case of a road, it may be, for example, the type of road. Further, the abstract model may include content information, which, in case of the road, may be information about bends and/or crossings. In
[0064] The data processing unit 40 may further be configured to classify sampling of sensor data according to task information and a selected abstract model in order to sample task relevant data, wherein an abstract model is selected from a set of abstract models associated with context and content information, the abstract models including models describing traffic scenes and the context and content information describing traffic scene environment. Moreover, the data processing unit 40 is configured to evaluate the selected abstract model based on received sensor data whether to maintain the selected abstract model or to select a new abstract model, the received sensor data representing an actual traffic scene, and to adapt the classification of sampling of sensor data based on the selected abstract model.
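As an illustration only, and not as part of the claimed method, an abstract model carrying the context and content information described above might be sketched as follows; the class and field names are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class AbstractModel:
    """Simplified description of a traffic scene based on surfaces.

    Hypothetical sketch; the field names are illustrative only and
    do not appear in the original description.
    """
    context: dict                 # scene-level information, e.g. {"road_type": "highway"}
    content: dict                 # scene content, e.g. {"bends": 2, "crossings": 0}
    surfaces: list = field(default_factory=list)  # vector/surface primitives

# A model for a straight highway scene with no crossings.
highway = AbstractModel(
    context={"road_type": "highway"},
    content={"bends": 0, "crossings": 0},
)
```

Such a structure would let the classification step read the context and content fields when deciding which sampling points are task relevant.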
[0065] The data processing unit 40 is configured to continuously collect task relevant data based on received sensor data, to add the task relevant data to the task information, and to provide the task information to an electronic control unit 20 of the vehicle 5. Further, the data processing unit 40 may be configured to update the context and content information of the abstract models with data from the task relevant data.
[0066] The data processing unit 40 comprises a sensor data handling module 41 including six main components 44-49, as shown in
[0067] The main components are provided or implemented in the processor 43 and/or the memory 42 of the data processing unit 40. The main components may in embodiments be distributed over different units, parts and systems of the vehicle 5, for example over the data processing unit 40 and the sensors 8. The sensing component 44 may for example be distributed in the sensors 8.
[0068] The components include a sensing component 44, an abstracting component 45, a learning component 46, a prediction component 47, a projecting component 48, and a decision component 49 which have direct or indirect interaction with each other.
[0069] Below, each component's role and the interaction between the components will be discussed. It should be noted that each component may have different elements as operational units.
[0070] Sensing Component: the component may communicate directly with several of the same or different sensing devices 8, e.g. one or more cameras, LIDAR or ultrasound. In case of closed devices (i.e. devices that offer no opportunity to interact with the device sampling), the output of the sensing component is pure data; otherwise the data is chosen in interaction with the other components.
[0071] Abstracting Component: the component may include several of the same or different elements, where each element has information of a surface form, e.g. flat surfaces and/or different types of non-flat surfaces. Generally, the elements are categorized into three abstraction levels: low, mid and high. In the abstracting component, either the data (e.g. pure data) are used or the predefined models of elements are refined to obtain the elements.
[0072] Learning Component: the component has two major sub-components: frame learning and chain learning. In frame learning, the elements of the abstracting component of each actual scene are used to learn a set of parameters related to the combination of elements, e.g. scaling, angles, etc. In chain learning, there are three elements: short, mid and long chain learning. Generally, in chain learning the short, mid and long memory of scenes is used to learn a set of parameters related to the changes of the combination of elements. In this component, externally available resources such as GPS, map data, videos, etc. are used for both the frame and chain learning sub-components.
[0073] Prediction Component: the component has two major sub-components: generation and quality evaluation. The generation sub-component has two major elements: surface generation and surface model (i.e. of combined surfaces) generation. Generally, the seed of the generation process is initiated by the learning component in combination with random processes. The quality evaluation sub-component uses a quality matrix to evaluate (accept, partially accept or reject) the generation result. The quality matrix is built upon learning component parameters and physical quality parameters related to the quality space of rational physical geometry in two-dimensional manifolds.
[0074] Projecting Component: the component has two elements: a search element and a match element. Generally, the abstracting component, the learning component and the prediction component provide a certain number of abstraction models to this component, and the search and match elements are then used to project the abstraction models onto the current scene. The two elements work interactively in an iterative process.
[0075] Decision Component: the component uses an element of statistical classification to determine the need for new abstraction modelling from the result of the projecting component. Generally, the projecting component attempts to find the best available abstraction model (i.e. from the available models in the learning component and the prediction component) for the current scene. In the two-dimensional manifold space, a set of evaluation parameters is found which represents the goodness of fit of the projected abstraction model to the actual scene. Each such evaluation parameter has a local and a global distribution (i.e. from the current situation and from past times). The statistical model for the classification is formed by the joint distribution of the evaluation parameters.
[0076] The output of the sensor data handling module 41 is an abstraction model of the actual scene and the related data in relation to the surfaces obtained by the abstraction model.
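The flow through the six components can be sketched as a single pass over a scene. The following is a minimal illustration only, with simple callables standing in for each component; none of the names or signatures are taken from the original description:

```python
def handle_scene(raw, abstract, learn, predict, project, decide):
    """Hypothetical pass through the sensor data handling module:
    sense -> abstract -> learn -> predict -> project -> decide."""
    elements = abstract(raw)              # abstracting component: surface elements
    params = learn(elements)              # learning component: frame/chain parameters
    candidates = predict(params)          # prediction component: generated models
    best, fit = project(candidates, raw)  # projecting component: search + match
    needs_new_model = decide(fit)         # decision component: statistical classification
    return best, needs_new_model

# Toy stand-ins: project picks the first candidate with fit 0.9;
# decide requests a new model only when the fit is poor (< 0.5).
result = handle_scene(
    raw=[1, 2, 3],
    abstract=lambda raw: len(raw),
    learn=lambda elements: {"scale": elements},
    predict=lambda params: ["model_a", "model_b"],
    project=lambda cands, raw: (cands[0], 0.9),
    decide=lambda fit: fit < 0.5,
)
```

The output mirrors paragraph [0076]: an abstraction model of the actual scene, plus a flag telling the decision component whether new abstraction modelling is needed.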
[0077] With reference now to
[0078] At step 102, a task is received, which may be navigating the vehicle 5 to a certain location. For example, it may be a passenger of the vehicle 5 entering the task into a monitor module of the vehicle 5. At step 104, sampling of sensor data is classified according to the task information and a selected abstract model in order to sample task relevant data, wherein the selected abstract model is selected from a set of abstract models associated with context and content information, the abstract models including models describing traffic scenes and the context and content information describing traffic scene environment. The abstract models are stored locally in sensors 8 of the vehicle, and/or in a memory 42 of the data processing unit 40, and/or in a memory 24 of the electronic control unit 20 and/or externally of the vehicle in a server based memory 7 or cloud based memory 7.
[0079] Depending upon the received sensor data, the sampling in the sensors 8 can be increased, maintained or decreased. For example, in case of an alert situation, e.g. an animal is detected in front of the vehicle, the sampling needs to be increased drastically. Another example is when the vehicle travels on a roadway in a desert landscape without any objects at all present; the sampling can then be kept at a minimum, for example, instead of 1000 pixels (which is the full size of the example sensor), 10-15 pixels can be sampled. Thereafter, at step 106, the selected abstract model is evaluated based on received sensor data in order to decide whether to maintain the selected abstract model or to select a new abstract model, the received sensor data representing an actual traffic scene. At step 108, the classification of sampling of sensor data is adapted based on the selected abstract model. Thereafter, the control loop 100 determines, at step 110, if the task has been completed or accomplished; if not, the loop 100 returns to step 104. If yes, the control loop 100 closes or terminates the task, at step 112.
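Steps 102 through 112 of the control loop can be sketched as follows. This is an illustrative outline only; the function names, the dict-based models and the helper logic are assumptions made for the sketch, not the actual interface:

```python
def control_loop(task, models, get_sensor_data, task_done, max_iters=100):
    """Hypothetical rendering of control loop 100: classify sampling
    (step 104), evaluate the model (step 106), adapt (step 108), and
    repeat until the task is completed (steps 110/112)."""
    selected = models[0]                        # initially selected abstract model
    for _ in range(max_iters):
        sampling = classify_sampling(task, selected)        # step 104
        data = get_sensor_data(sampling)
        selected = evaluate_model(selected, models, data)   # step 106
        # step 108: the adapted classification takes effect on the
        # next call to classify_sampling with the (possibly new) model
        if task_done(data):                                 # step 110
            return "task closed"                            # step 112
    return "aborted"

def classify_sampling(task, model):
    # Illustrative rule: sample densely only for busy scenes.
    return {"pixels": 1000 if model.get("busy") else 15}

def evaluate_model(selected, models, data):
    # Illustrative rule: switch model when the scene no longer matches.
    return selected if data.get("matches") else models[-1]

status = control_loop(
    task="navigate",
    models=[{"busy": False}, {"busy": True}],
    get_sensor_data=lambda s: {"matches": True},
    task_done=lambda d: True,
)
```

The key design point mirrored here is that evaluation and adaptation happen inside the loop, so the sampling classification tracks the abstract model on every iteration.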
[0080] With reference now to
[0081] At step 202, a task is received, which may be navigating the vehicle 5 to a certain location. For example, it may be a passenger of the vehicle 5 entering the task into a monitor module of the vehicle 5. At step 204, sampling of sensor data is classified according to the task information and a selected abstract model in order to sample task relevant data, wherein the selected abstract model is selected from a set of abstract models associated with context and content information, the abstract models including models describing traffic scenes and the context and content information describing traffic scene environment. The abstract models are stored locally in sensors 8 of the vehicle, and/or in a memory 42 of the data processing unit 40, and/or in a memory 24 of the electronic control unit 20 and/or externally of the vehicle in a server based memory 7 or cloud based memory 7.
[0082] Thereafter, at step 206, the selected abstract model is evaluated based on received sensor data in order to decide whether to maintain the selected abstract model or to select a new abstract model, the received sensor data representing an actual traffic scene.
[0083] At step 207, the selected abstract model is updated, i.e. the context and content information of the abstract models is updated with data from received sensor data, for example, with task relevant data.
[0084] At step 208, the classification of sampling of sensor data is adapted based on the selected abstract model. Thereafter, the control loop 200 determines, at step 210, if the task has been completed or accomplished; if not, the loop 200 returns to step 204. If yes, the control loop 200 closes or terminates the task, at step 212.
[0085] Depending upon the received sensor data, the sampling in the sensors 8 can be increased, maintained or decreased. For example, in case of an alert situation, e.g. an animal is detected in front of the vehicle, the sampling needs to be increased drastically. Another example is when the vehicle travels on a roadway in a desert landscape without any objects at all present; the sampling can then be kept at a minimum, for example, instead of 1000 pixels (which is the full size of the example sensor), 10-15 pixels can be sampled.
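The sampling adjustment described above can be sketched as a simple rule. The 1000-pixel full sensor size and the 10-15 pixel minimum follow the example in the text; the function itself and its scene representation are hypothetical:

```python
def choose_sample_size(scene):
    """Pick how many of the example sensor's 1000 pixels to sample.

    Illustrative rule following the text: an alert scene samples the
    full sensor, an empty scene (e.g. a desert roadway with no objects)
    samples only a minimum of pixels, and any other scene keeps a
    moderate default rate. The thresholds are assumptions.
    """
    full_size = 1000
    if scene.get("alert"):           # e.g. an animal detected in front of the vehicle
        return full_size             # increase sampling drastically
    if not scene.get("objects"):     # no objects at all present
        return 15                    # keep sampling at a minimum
    return full_size // 4            # maintain a moderate default

# e.g. an empty desert roadway:
minimal = choose_sample_size({"alert": False, "objects": []})
```

This illustrates the energy/bandwidth saving the description aims at: the sensor only pays full sampling cost when the abstract model or an alert demands it.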
[0086] Turning now to
[0087] At step 302, a task is received, which may be navigating the vehicle 5 to a certain location. For example, a passenger of the vehicle 5 may enter the task into a monitor module of the vehicle 5.
[0088] At step 304, sampling of sensor data is classified according to the task information and a selected abstract model in order to sample task relevant data, wherein the selected abstract model is selected from a set of abstract models associated with context and content information, the abstract models including models describing traffic scenes and the context and content information describing the traffic scene environment.
[0089] Thereafter, at step 306, the selected abstract model is evaluated based on received sensor data in order to decide whether to maintain the selected abstract model or to select a new abstract model, the received sensor data representing an actual traffic scene. The abstract models are stored locally in sensors 8 of the vehicle, and/or in a memory 42 of the data processing unit 40, and/or in a memory 24 of the electronic control unit 20 and/or externally of the vehicle in a server based memory 7 or cloud based memory 7.
[0090] Depending upon the received sensor data, the sampling in the sensors 8 can be increased, maintained or decreased. For example, in the case of an alert situation, e.g., when an animal is detected in front of the vehicle, the sampling needs to be increased drastically. Another example is when the vehicle travels on a roadway in a desert landscape without any objects present at all; the sampling can then be kept at a minimum, for example, 10-15 pixels can be sampled instead of 1000 pixels (the full size of the example sensor).
[0091] At step 307, the selected abstract model is updated, i.e., the context and content information of the abstract model is updated with data from received sensor data, for example, with task relevant data.
[0092] At step 308, the classification of sampling of sensor data is adapted based on the selected abstract model.
[0093] At step 309, task information is provided to a control unit of the vehicle, for example, the ECU 20, which may send instructions to other units or systems of the vehicle, such as the braking system and/or steering system 23. However, this step is not necessarily executed after step 308 but can be executed in connection with other steps and/or continuously during the control loop 300.
[0094] Thereafter, at step 310, the control loop 300 determines if the task has been completed or accomplished; if not, the loop 300 returns to step 304. If yes, the control loop 300 closes or terminates the task at step 312.
[0095] Referring now to
[0096] These elements may be communicatively connected or coupled to a network 9. Although only one vehicle 50, one server or cloud based solution and one network 9 are shown in
[0097] The vehicle 50 may include a car, a truck, a sports utility vehicle (SUV), a bus, a semi-truck, a drone, or any other roadway-based conveyance. A roadway-based conveyance is a hardware device that traverses the top surface of a roadway.
[0098] In embodiments, the vehicle 50 may include an autonomous vehicle, a semi-autonomous vehicle or a Highly Automated Vehicle (HAV). For example, the vehicle 50 may include an ADAS system 11 which is operable to make the vehicle 50 an autonomous vehicle. An HAV is a vehicle 50 whose ADAS system 11 operates at level 3 or higher as defined by the NHTSA in the policy paper Federal Automated Vehicles Policy: Accelerating the Next Revolution in Roadway Safety, published in September 2016.
[0099] The vehicle 50 may further include a sensor set 508 comprising camera sensors 512a and 512b, one or more LIDAR sensors 514a and 514b, and one or more radar sensors 516a and 516b. In the present example, two camera sensors, two LIDAR sensors, and two radar sensors are shown. However, any number of sensors may be included; a vehicle may, for example, often have 10-15 camera sensors. It should also be understood that the locations of the sensors in the figures are schematic, i.e., for illustrative purposes only, and the sensors may be located at any suitable location in the vehicle 50 in order to satisfy their functions and purposes.
[0100] Furthermore, the vehicle 50 may include an actuator set 18, a hardware ECU 20, a communication unit 26, and a GPS unit 28. The ECU 20 in turn includes a processor 22 and a memory 24. These elements of the vehicle 50 are communicatively coupled to one another via a bus 30. The memory 24 may be a non-transitory computer readable memory. The memory 24 stores instructions or data that can be executed by the processor 22. The instructions or data contain code for performing the techniques described herein. In some embodiments the memory 24 may be a dynamic random access memory (DRAM) device, a static random access memory (SRAM) device, flash memory, or some other memory device. In some embodiments, the memory 24 also includes a non-volatile memory or similar permanent storage device and media, including a hard disk drive, a floppy disk drive, a CD-ROM device, a DVD-ROM device, a DVD-RAM device, a DVD-RW device, a flash memory device or some other mass storage device for storing information on a more permanent basis.
[0101] The actuator set 18 includes one or more actuators of the vehicle 50. For example, the actuator set 18 includes one or more of the following: one or more hydraulic actuators, one or more electric actuators and one or more thermal actuators.
[0102] The processor 22 may include an arithmetic logic unit, a microprocessor, a general purpose controller or some other processor array to perform computations and provide electronic display signals to a display device. The processor 22 processes data signals and may include various computing architectures including a complex instruction set computer (CISC) architecture, a reduced instruction set computer (RISC) architecture or an architecture implementing a combination of instruction sets. Although
[0103] The sensor set 508 includes camera sensors 512a, 512b, LIDAR sensors 514a, 514b and radar sensors 516a, 516b. However, it may further include millimeter wave radar, a speed sensor, a laser altimeter, a navigation sensor (e.g. a global positioning system sensor of the GPS unit), an infrared detector, a motion detector, a thermostat, a sound detector, a carbon monoxide detector, an oxygen sensor, a mass air flow sensor, an engine coolant temperature sensor, a throttle position sensor, a crank shaft position sensor, an automobile engine sensor, a valve timer, an air-fuel ratio meter, a blind spot meter, a curb feeler, a defect meter, a Hall effect sensor, a manifold absolute pressure sensor, a parking sensor, a radar gun, a speedometer, a transmission fluid temperature sensor, a turbine speed sensor, a variable reluctance sensor, a wheel speed sensor, and any type of automotive sensor that may be present in an HAV.
[0104] The sensor set 508 may be operable to record data (hereinafter sensor data) that describes one or more measurements of the sensors included in the sensor set 508.
[0105] As indicated above, the sensor set 508 includes sensors that are operable to measure the physical environment outside the vehicle 50. For example, the sensor set 508 may record one or more physical characteristics of the physical environment that is proximate to the vehicle 50. The measurements recorded by the sensors are described by the sensor data, which may be stored in the memory 24 or in the sensor data processing unit 540, which will be described below.
[0106] The sensor data may describe the physical environment proximate to the vehicle at one or more times. The sensor data may be timestamped by the sensors of the sensor set 508.
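The timestamping described above may be sketched as follows; the record layout and field names are assumptions for illustration, not taken from the specification.

```python
import time

def make_sample(sensor_id, measurement, timestamp=None):
    """Wrap a raw measurement with its sensor id and a timestamp, so the
    sensor data describes the environment at one or more times."""
    return {"sensor": sensor_id,
            "value": measurement,
            "t": time.time() if timestamp is None else timestamp}
```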
[0107] The sensor set 508 includes, as described above, sensors 512a, 512b, 514a, 514b, 516a, and 516b that are operational to measure, among other things: the physical environment, or roadway environment, where the vehicle 50 is located; the static objects within this physical environment; the dynamic objects within the physical environment and the behaviour of these objects; the position of the vehicle 50 relative to static and dynamic objects within the physical environment (e.g., as recorded by one or more range-finding sensors such as LIDAR); the weather and other natural phenomena within the physical environment over time; and coefficients of friction and other variables describing objects.
[0108] The communication unit 26 transmits and receives data to and from the network 9 or another communication channel. In some embodiments, the communication unit 26 may include a DSRC transceiver, a DSRC receiver and the hardware or software necessary to make the vehicle 50 a DSRC-enabled device.
[0109] In embodiments of the present invention, the communication unit 26 includes a port for direct physical connection to the network 9 or to another communication channel. For example, the communication unit 26 includes a USB, SD, CAT-5, or similar port for wired communication with the network 9. In some embodiments, the communication unit 26 includes a wireless transceiver for exchanging data with the network 9 or other communication channel using one or more wireless communication methods, including IEEE 802.11; IEEE 802.16; BLUETOOTH; EN ISO 14906:2004 Electronic Fee Collection Application interface; EN 12253:2004 Dedicated Short-Range Communication Physical layer using microwave at 5.8 GHz (review); EN 12795:2002 Dedicated Short-Range Communication (DSRC) data link layer: Medium Access and logical link control (review); EN 12834:2002 Dedicated Short-Range Communication Application layer (review); and EN 13372:2004 Dedicated Short-Range Communication (DSRC) profiles for RTTT applications (review).
[0110] In some embodiments, the communication unit 26 includes a cellular communication transceiver for sending and receiving data over a cellular communications network including via short messaging service (SMS), multimedia messaging service (MMS), hypertext transfer protocol (HTTP or HTTPS if the secured implementation of HTTP is used), direct data connection, WAP, email or any other suitable type of electronic communication. In some embodiments, the communication unit 26 also provides other conventional connections to the network 9 for distribution of files or media using standard network protocols including TCP/IP, HTTP, HTTPS and SMTP, millimeter wave, DSRC, etc.
[0111] The vehicle 50 further includes one or more driving systems 23, such as a braking system and a steering system 23.
[0112] In this embodiment of the present invention, the sensors 512a, 512b, 514a, 514b, 516a, and 516b include a sensor data processing unit 540, which is arranged in the vehicle 50 and may be communicatively coupled, via the bus 30, to the electronic control unit 20, as well as to other units of the vehicle such as the GPS unit 28 and the communication unit 26. Preferably, each sensor includes one sensor data processing unit 540.
[0113] The sensor data processing unit 540 includes a processor, such as an arithmetic logic unit, a microprocessor, a general purpose controller or some other processor array to perform computations and provide electronic display signals to a display device. The sensor data processing unit 540 processes data signals and may include various computing architectures including a complex instruction set computer (CISC) architecture, a reduced instruction set computer (RISC) architecture or an architecture implementing a combination of instruction sets. The sensor data processing unit 540 may further include a memory. The memory may be a non-transitory computer readable memory. The memory stores instructions or data that can be executed by the sensor data processing unit 540. The instructions or data contain code for performing the techniques described herein. In some embodiments the memory may be a dynamic random access memory (DRAM) device, a static random access memory (SRAM) device, flash memory, or some other memory device. In some embodiments, the memory also includes a non-volatile memory or similar permanent storage device and media, including a hard disk drive, a floppy disk drive, a CD-ROM device, a DVD-ROM device, a DVD-RAM device, a DVD-RW device, a flash memory device or some other mass storage device for storing information on a more permanent basis.
[0114] The sensor data processing unit 540 may be configured to receive tasks, each task being associated with task information. A task may, for example, be navigation to a certain location, area or place, e.g., defined by map coordinates and GPS coordinates. The task information may be information related to the task and needed to solve the task. For example, it may be coordinates, or information related to available roads or barriers along the road. The task relevant data is data from sensors that is relevant to an ongoing task and may be added to the task information. For example, it may be information that describes the roadway environment outside the vehicle 50, the location of remote vehicles, objects in the roadway or in the surroundings of the roadway relative to the vehicle 50, the operational status of the vehicle 50 (e.g., kinematic data for the vehicle 50), map data and GPS data.
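A possible structure for a task and its associated task information, as described above, is sketched below. The class and field names are hypothetical; the specification only describes which kinds of information the task information may carry.

```python
from dataclasses import dataclass, field

@dataclass
class TaskInformation:
    """Information related to the task and needed to solve it."""
    destination: tuple                              # map/GPS coordinates of the target
    available_roads: list = field(default_factory=list)
    barriers: list = field(default_factory=list)
    task_relevant_data: list = field(default_factory=list)

@dataclass
class NavigationTask:
    """A task, e.g. navigation to a certain location."""
    info: TaskInformation

    def add_relevant(self, observation):
        # Task relevant sensor data is added to the task information.
        self.info.task_relevant_data.append(observation)
```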
[0115] In some embodiments, the operational status of the vehicle 50 is information in the sensor data that describes one or more of the following: kinematic data for the vehicle, the latitude and longitude of the vehicle, the heading of the vehicle, braking system data (e.g., whether the brakes are engaged), the elevation of the vehicle, the current time for the vehicle, the speed of the vehicle, the steering angle of the vehicle, the acceleration of the vehicle, the path history of the vehicle, and an estimate of the future path of the vehicle.
[0116] The GPS data may include digital data that describes the geographic location of the vehicle, for example, latitude and longitude with lane-level accuracy.
[0117] The map data may include digital data that describes, for different combinations of latitude and longitude as indicated by the GPS data, different geographical features of the roadway environment, such as the presence of curves in the roadway, whether the road is bumpy, the average vehicular speeds along the roadway at different times of day, etc.
[0118] An abstract model is a simplified model or description of a scene or environment based on vectors or surfaces. The abstract model may contain information about the scene, i.e., context information. In the case of a road, it may be, for example, the type of road. Further, the abstract model may include content information, which, in the case of the road, may be information about bends and/or crossings. In
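The abstract model described above, with context information (e.g., type of road) and content information (e.g., bends and crossings), can be sketched as a minimal data structure. The class name and field layout are assumptions for illustration only.

```python
# Hypothetical sketch of an abstract model of a road scene.

class AbstractRoadModel:
    """Simplified description of a road scene based on context and
    content information."""
    def __init__(self, road_type):
        self.context = {"road_type": road_type}        # context information
        self.content = {"bends": [], "crossings": []}  # content information

    def add_bend(self, position):
        self.content["bends"].append(position)

    def add_crossing(self, position):
        self.content["crossings"].append(position)
```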
[0119] The sensor data processing unit 540 may further be configured to classify sampling of sensor data according to task information and a selected abstract model in order to sample task relevant data, wherein an abstract model is selected from a set of abstract models associated with context and content information, the abstract models including models describing traffic scenes and the context and content information describing the traffic scene environment. Moreover, the sensor data processing unit 540 is configured to evaluate, based on received sensor data, whether to maintain the selected abstract model or to select a new abstract model, the received sensor data representing an actual traffic scene, and to adapt the classification of sampling of sensor data based on the selected abstract model.
[0120] The sensor data processing unit 540 is configured to continuously collect task relevant data based on received sensor data, add the task relevant data to the task information and provide the task information to an electronic control unit 20 of the vehicle 50. Further, the sensor data processing unit 540 may be configured to update the context and content information of the abstract models with the task relevant data.
[0121] The sensor data processing unit 540 comprises a sensor data handling module 41 as shown in
[0122] With reference now to
[0123] At step 802, a task is received, which may be navigating the vehicle 50 to a certain location. For example, a passenger of the vehicle 50 may enter the task into a monitor module of the vehicle 50. Preferably, each sensor, or the sensor data processing unit 540 of each sensor, receives the task. At step 804, sampling of sensor data is classified according to the task information and a selected abstract model in order to sample task relevant data in each respective sensor, wherein the selected abstract model is selected from a set of abstract models associated with context and content information, the abstract models including models describing traffic scenes and the context and content information describing the traffic scene environment. The abstract models are stored locally in the sensors 508 of the vehicle, and/or in a memory 24 of the electronic control unit 20, and/or externally of the vehicle in a server based memory 7 or cloud based memory 7. A sensor can select an abstract model starting from a received position indication, for example, received over the network 9 or from the GPS unit 28.
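Selecting an abstract model starting from a received position indication can be sketched as below. The model records, region bounds and matching rule are hypothetical; in practice the stored models could be keyed by map regions, road identifiers or similar.

```python
# Hypothetical sketch: pick a stored abstract model from a GPS position.

def select_model_by_position(models, position):
    """Return the first stored abstract model whose region contains the
    received position; fall back to the first model otherwise."""
    lat, lon = position
    for model in models:
        (lat_min, lat_max), (lon_min, lon_max) = model["region"]
        if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
            return model
    return models[0]
```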
[0124] Depending upon the sensed sensor data in the respective sensor 508, the sampling in the respective sensors 508 can be increased, maintained or decreased. For example, in the case of an alert situation, e.g., when an animal is detected in front of the vehicle, the sampling needs to be increased drastically. Another example is when the vehicle travels on a roadway in a desert landscape without any objects present at all; the sampling can then be kept at a minimum, for example, 10-15 pixels can be sampled instead of 1000 pixels (the full size of the example sensor).
[0125] Thereafter, at step 806, the selected abstract model is evaluated based on sensor data in order to decide whether to maintain the selected abstract model or to select a new abstract model, the sensor data representing an actual traffic scene. At step 808, the classification of sampling of sensor data in the respective sensor 508 is adapted based on the selected abstract model. Thereafter, at step 810, the control loop 800 determines if the task has been completed or accomplished; if not, the loop 800 returns to step 804. If yes, the control loop 800 closes or terminates the task at step 812.
[0126] Although exemplary embodiments of the present invention have been shown and described, it will be apparent to those having ordinary skill in the art that a number of changes, modifications, or alterations to the invention as described herein may be made. Thus, it is to be understood that the above description of the invention and the accompanying drawings are to be regarded as a non-limiting example thereof and that the scope of protection is defined by the appended patent claims.