METHOD AND SYSTEM FOR OPERATING A MOBILE ROBOT
20210380119 · 2021-12-09
Inventors
CPC classification
G01S13/40
PHYSICS
G06V20/58
PHYSICS
B60W2552/05
PERFORMING OPERATIONS; TRANSPORTING
G01S15/86
PHYSICS
B60W2050/0215
PERFORMING OPERATIONS; TRANSPORTING
International classification
Abstract
The present invention relates to a method comprising obtaining validation sensor data from a sensor measurement at a validation observation time; generating a validation finding based on the validation sensor data; obtaining first sensor data from a sensor measurement at an observation time preceding the validation observation time; generating a first finding based on the first sensor data; and testing the first finding based on the validation finding. The present invention also relates to a corresponding method and a corresponding use.
Claims
1. A method comprising obtaining validation sensor data from a sensor measurement at a validation observation time; generating a validation finding based on the validation sensor data; obtaining first sensor data from a sensor measurement at an observation time preceding the validation observation time; generating a first finding based on the first sensor data; and testing the first finding based on the validation finding.
2. The method according to claim 1, wherein the method comprises a robot driving in an environment, wherein the robot comprises a sensor unit, and wherein the method further comprises the sensor unit generating initial validation sensor data and initial first sensor data, wherein the validation sensor data is based on the initial validation sensor data and the first sensor data is based on the initial first sensor data.
3. The method according to claim 2, wherein the validation finding relates to a presence of a vehicle on a road; the first finding relates to a presence of a vehicle on the road; and when the initial validation sensor data is generated, the robot is closer to the vehicle than when the initial first sensor data is generated.
4. The method according to claim 3, wherein the method further comprises: processing the initial validation sensor data to generate the validation sensor data, wherein a quotient between the initial validation sensor data and the validation sensor data is greater than 10, preferably greater than 1,000, further preferably greater than 100,000, such as greater than 1,000,000; and processing the initial first sensor data to generate the first sensor data, wherein a quotient between the initial first sensor data and the first sensor data is greater than 10, preferably greater than 1,000, further preferably greater than 100,000, such as greater than 1,000,000.
5. The method according to claim 3, wherein the step of testing the first finding based on the validation finding is triggered when the validation finding indicates that a vehicle is present on the road.
6. The method according to claim 1, wherein the observation time precedes the validation observation time by 1 s to 20 s, preferably by 1 s to 10 s, such as by 1 s to 5 s.
7. The method according to claim 2, wherein the method comprises: generating a plurality of validation findings; and generating a plurality of first findings and testing each of the plurality of the first findings based on a validation finding, and thus creating a plurality of test results, wherein the method further comprises utilizing the test results to determine a detection performance of the robot.
8. The method according to claim 7, wherein the method comprises: a plurality of robots driving in the environment, and wherein each of the steps is performed for each of the robots to thus determine a detection performance for each of the robots.
9. The method according to claim 8, wherein the method further comprises comparing detection performances of the robots to detect malfunctions.
10. The method according to claim 7, wherein the method further comprises generating a safety score for a road based on the plurality of test results, wherein the safety score is preferably based on a percentage of false negatives within the test results.
11. The method according to claim 1, wherein the method further comprises: obtaining additional validation sensor data from a sensor measurement at an additional validation observation time, wherein the validation finding is generated based on the validation sensor data and on the additional validation sensor data.
12. The method according to claim 11, wherein the additional validation observation time is within 2000 ms, preferably 1000 ms, further preferably 500 ms of the validation observation time.
13. The method according to claim 1, wherein the step of testing the first finding based on the validation finding is performed at least 30 s, preferably at least 1 minute, further preferably at least 10 minutes, such as at least 20 minutes, after the validation observation time.
14. A system configured to carry out the method according to claim 1.
15. Use of the system according to claim 14 for carrying out the method according to claim 1.
Description
[0136] The present invention will now be described with reference to the accompanying drawings which illustrate embodiments of the invention. These embodiments should only exemplify, but not limit, the present invention.
[0145] It is noted that not all the drawings carry all the reference signs. Instead, in some of the drawings, some of the reference signs have been omitted for the sake of brevity and simplicity of illustration. Embodiments of the present invention will now be described with reference to the accompanying drawings.
[0146] Initially, general embodiments of the present technology will be described with general reference to multiple drawings, before describing the individual drawings and their components in greater detail.
[0147] As for example depicted in
[0148] After some time, e.g., 2 to 4 seconds later, the robot 10 may use the same or other sensors to detect whether a car has passed in front of it on the traffic road 20. The sensors employed in the present technology (both for the first detection and for the subsequent detection) may include cameras, ultrasonic sensors, radars, and/or time of flight sensors.
[0149] That is, in simple words and with exemplary reference to
[0150] In some embodiments, the present technology may also include a matching process to match an object at the observation time t.sub.obs=t.sub.2 to an object at the first observation time t.sub.1. For example, the direction from which a vehicle comes may be used, but also other parameters, such as the time to passing in front of the robot based on the detection at the first observation time t.sub.1. Generally, time, distance, speed and/or acceleration may be used for such a matching process. For example, if at the observation time t.sub.obs=t.sub.2, a car is detected to pass with a velocity of 5 m/s, then this information can be used to perform matching with previous images. The acceleration may be calculated based on the additional validation data close to the validation observation time. Furthermore, not only the speed (and acceleration) at the instant of the validation observation time may be used, but also the time-evolution of the speed (and acceleration) between the first observation time and the validation observation time. That time-series data may be obtained by following an approaching object back in time from the validation observation time, which may also be referred to as tracking an (approaching) object in a traffic environment 1000 over time.
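Such a loose matching step may be sketched as follows. This is an illustrative mock-up only: the `Detection` record, its field names and the 1.5 s tolerance are assumptions for the example and not part of the claimed method.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    t: float          # observation time in seconds
    distance: float   # estimated distance to the robot in metres
    speed: float      # estimated speed towards the robot in m/s
    direction: str    # side the object approaches from, e.g. "left" or "right"

def predicted_passing_time(d: Detection) -> float:
    """Estimate when the detected object should pass in front of the robot,
    assuming roughly constant speed after the first observation time t1."""
    return d.t + d.distance / d.speed

def matches(first: Detection, passing_time: float, passing_direction: str,
            tolerance_s: float = 1.5) -> bool:
    """Loosely match a far-range detection at t1 to a passing event observed
    at the validation observation time t2, using the approach direction and
    the predicted time-to-passing."""
    if first.direction != passing_direction:
        return False
    return abs(predicted_passing_time(first) - passing_time) <= tolerance_s
```

For instance, an object detected 20 m away approaching at 5 m/s is predicted to pass 4 s later; a passing event observed 4.3 s later from the same direction would be matched to that detection.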
[0151] Generally, it will be understood that the following situations may occur: [0152] A car passes in front of the robot, and it was previously detected with far-range sensors (true positive)—see [0153] No car passes in front of the robot, but a car was previously detected with far-range sensors (false positive). [0154] A car passes in front of the robot, but it was not previously detected with far-range sensors (false negative). [0155] No car passes in front of the robot, and no car was previously detected with far-range sensors (true negative).
[0156] Out of the above four scenarios, the most critical one is the false negative, since false negatives can lead to accidents. False positives are also undesirable, since they can lead to lower efficiency and a lower average speed of the robot (since it would wait longer to cross the road out of an overabundance of caution).
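The four scenarios, and the share of false negatives among the test results, can be sketched as follows. The function names are illustrative; the false-negative share is one possible basis for the per-road safety score recited in claim 10.

```python
def classify(previously_detected: bool, car_passed: bool) -> str:
    """Label one crossing episode according to the four scenarios above."""
    if car_passed:
        return "true positive" if previously_detected else "false negative"
    return "false positive" if previously_detected else "true negative"

def false_negative_rate(test_results: list) -> float:
    """Share of false negatives among all test results; such a percentage
    could underlie a safety score for a road."""
    if not test_results:
        return 0.0
    return test_results.count("false negative") / len(test_results)
```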
[0157] It will be understood that the presently described validation method uses future data to validate present data or present data to validate past data. The presently described technology is useful, e.g., for testing the quality of far-range car detectors. Further, it will also be understood that the present method can be used to automatically annotate data, which can subsequently be used, e.g., for setting detection thresholds for detectors.
[0158] Generally, it should be understood that the presently described detection and validation method may be employed both locally on the robot 10 (which may also be referred to as “live”) and external to the robot 10 (which may also be referred to as “offline”).
[0159] In the offline case, sensor data may be sent to a data processing unit external to the robot 10 (e.g., a server) and the detection algorithms may be run on the data processing unit, and the method may be performed on the server. Thus, detection algorithms can be tested using the sensor data as an input.
[0160] In the local case, i.e., when the method is employed live on the robot 10, it is possible that the detection of a passing car (see, e.g.,
[0161] It will be understood that it is generally not necessary that the presently described technology runs instantly on the robot 10. Consider, e.g., again the case of
[0162] In other words, e.g., in order to preserve the raw data of the initial detection (real car detection), the passing car detections can be calculated, e.g., 30 minutes later. In embodiments of the present technology, the robot 10 may only overwrite the raw sensor data about 1 hour after obtaining it. Thus, part of the presently described routines may be performed with a substantial delay (such as 1 minute, 5 minutes, 10 minutes, or even 30 minutes) after the raw data has been obtained. For example, this means that such steps can be performed, e.g., after the crossing has been finished.
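One possible way to organize such delayed processing is a retention buffer for raw data, sketched below under the assumption that raw frames are kept for about one hour and validated roughly 30 minutes after capture. All names and constants are illustrative, not prescribed by the disclosure.

```python
from collections import deque

RETENTION_S = 3600.0          # raw sensor data kept for about one hour
VALIDATION_DELAY_S = 1800.0   # validation may run, e.g., 30 minutes later

class RawDataBuffer:
    """Keeps timestamped raw sensor frames long enough for the delayed
    passing-car validation step to run before the data is overwritten."""

    def __init__(self):
        self._frames = deque()  # (timestamp, frame) pairs, oldest first

    def add(self, timestamp, frame):
        self._frames.append((timestamp, frame))

    def prune(self, now):
        """Drop (overwrite) frames older than the retention window."""
        while self._frames and now - self._frames[0][0] > RETENTION_S:
            self._frames.popleft()

    def ready_for_validation(self, now):
        """Frames that are old enough to validate but not yet overwritten."""
        return [frame for t, frame in self._frames
                if VALIDATION_DELAY_S <= now - t <= RETENTION_S]
```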
[0163] More particularly referring to
[0164] Furthermore, the mobile robot 10 may be required to travel in a plurality of different traffic environments 1000, which may, inter alia, imply traveling on sidewalks, bike lanes and/or driveways. The mobile robot 10 may also be assigned tasks requiring, for example, crossing roads, and it will also be understood that such scenarios may require execution of further steps to bring the assigned tasks to a successful completion. When a mobile robot 10 is traveling on sidewalks, the tasks may mainly be focused on the safe interaction of the mobile robot 10 with other traffic participants on sidewalks, e.g. pedestrians. However, when the mobile robot 10 is approaching a road 20, additional traffic participants have to be considered: besides pedestrians, the mobile robot 10 may encounter a plurality of driving vehicles, which may carry a higher probability of occurrence of endangering scenarios, such as, for example, a higher probability of a collision of the mobile robot 10 with a driving vehicle. The road 20 may also be referred to as traffic road(s) 20, pedestrian crossing 20, road segment 20, stretch 20 or simply as segment 20. It will be understood that it may also comprise other types of roads, such as, for example, crossroads.
[0165] In
[0166] In one embodiment, the mobile robot 10 may comprise at least one sensor configured to detect driving vehicles. The at least one sensor may also be referred to as detecting component 200 (see
[0167] In another scenario of the traffic environment 1000, the mobile robot 10 may determine that the moving vehicles 40 and 50 do not represent any risk of collision, and therefore the mobile robot 10 may continue its planned route. In simple words, if the mobile robot 10 determines that the vehicles 40 and 50 are moving on the road 20, but their routes will not obstruct its trajectory, e.g. the moving objects are reducing their speed and are about to stop (at a traffic light) or the distance of the moving objects to the mobile robot 10 is within safe thresholds, the mobile robot 10 may consider crossing the road without interruption, i.e. without having to wait for the moving vehicles. In some instances, such a decision may be advantageous, as it may allow optimization of the traveling of the mobile robot 10, for example, by avoiding unnecessary interruptions of the journey, which may result in an increase of the average traveling speed of the mobile robot 10, i.e. it may allow reducing traveling times between an initial point and a final destination, which subsequently permits the mobile robot 10 to efficiently perform several tasks in a reduced time. In the present invention, the mobile robot 10 may be configured to travel at different speeds according to the speed of the traffic environment 1000, and such speeds may be in the range of 0 to 30 km/h, preferably 0 to 10 km/h, more preferably 0 to 6 km/h, as the robot 10 may particularly operate on a sidewalk. Therefore, it will also be understood that, whenever referring to the average speed and maximizing and/or optimizing the average speed of the mobile robot 10, the speed of the mobile robot 10 does not exceed 30 km/h. It will also be understood that the speed of the mobile robot 10 is configured to be adjusted to the average speed of the traffic environment 1000, e.g. if a pedestrian is walking in front of the mobile robot 10, the mobile robot 10 may be able to adjust its traveling speed to the speed of the pedestrian.
[0168] The mobile robot 10 may evaluate a plurality of traffic environments 1000, which may consequently lead to several possible outcomes impacting the decision made by the mobile robot 10. For instance, the mobile robot 10 may detect one or several vehicles approaching a common road 20; thus, the mobile robot 10 may be required to evaluate the situation and make a decision, e.g. stopping at the road 20 and waiting until the traffic environment 1000 is cleared, i.e. until the detected vehicles have passed. Such possible scenarios are explained in detail below.
[0169]
[0170] As discussed, the system 100 may also comprise a processing component, conceptually identified by reference numeral 300. The processing component 300 may be configured to retrieve information from the sensors 200 as initial sensor data 202, and may further be configured to process the initial sensor data 202 to generate sensor data, conceptually identified by reference numeral 302. The sensor data 302 may also be referred to as processed sensor data 302. The processed sensor data 302 may comprise information relating to velocities (such as speeds and directions) of detected objects.
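The data reduction implied here (and quantified by the quotients recited in claim 4) can be illustrated with a minimal sketch: a raw camera frame of several megabytes is reduced to a few compact object descriptors. The packing format, field names and frame size are assumptions for illustration only.

```python
import struct

def pack_objects(objects):
    """Pack each detected object as (distance_m, speed_mps, heading_deg),
    three 32-bit floats, i.e. 12 bytes per object."""
    return b"".join(struct.pack("<fff", o["distance"], o["speed"], o["heading"])
                    for o in objects)

RAW_FRAME_BYTES = 1920 * 1080 * 3  # one uncompressed RGB camera frame

objects = [{"distance": 42.0, "speed": 5.0, "heading": 90.0},
           {"distance": 12.5, "speed": 0.0, "heading": 270.0}]
processed = pack_objects(objects)

# Quotient between the size of the initial sensor data and the size of the
# processed sensor data, as used in claim 4.
quotient = RAW_FRAME_BYTES / len(processed)
```

With these illustrative numbers, two 12-byte descriptors replace roughly 6 MB of pixels, giving a quotient well above 100,000.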
[0171] The sensor data 302 may be provided to an additional processing component 400. The additional processing component 400 may generate a first finding or a first hypothesis 402 based on first sensor data and a validation finding 404 based on validation sensor data, wherein the first sensor data is based on a measurement preceding the measurement forming the basis for the validation sensor data. These findings 402, 404 may be compared with one another to thus test the first finding and to determine the validity of the first finding 402.
[0172] The findings 402, 404 may contain analyzed information regarding the traffic environment 1000. For instance, the first finding may contain information such as a list of identified objects, which may comprise potentially moving objects as well as potentially static objects, which may be of interest for the mobile robot 10 to execute further tasks such as stopping at a given point, rerouting its trajectories, etc. Further, examples of identified objects as potentially moving objects may comprise vehicles moving away from and/or towards the mobile robot 10, such as cars, buses, trucks and bicycles (representing first findings 402). Moreover, a potentially moving object may also represent a person walking away from and/or towards the mobile robot 10. Examples of identified objects as potentially static objects may comprise parked vehicles such as cars, buses, trucks and bicycles. Moreover, a potentially static object may also represent a traffic light at a pedestrian crossing 20, and, for example, a person standing at the traffic light and waiting for traffic light clearance to cross the road, etc.
[0173] That is, in one embodiment, the additional processing component 400 may also be configured to generate a plurality of first findings or hypotheses regarding the identified objects and may further analyze a plurality of possible outcome scenarios. For instance, the processing component may be configured to determine whether a detected object may in fact be a moving object and whether the moving object may be moving towards the mobile robot 10.
[0174] In another embodiment, the additional processing component 400 may further be configured to evaluate the possible consequences of a plurality of scenarios. For instance, the additional processing component 400 may be able to infer whether a given scenario may result in a safety-endangering event, such as, for example, a collision of the mobile robot 10 with a moving object.
[0175] It will be understood that the additional processing component 400 may comprise a plurality of algorithms configured to execute the detection of objects in a traffic environment 1000.
[0176] The additional processing component 400 may be realized as a server 400 external to and remote from the robot 10. It will be understood that the server 400 may comprise at least one server 400 and therefore may also be referred to as servers 400. It will also be understood that the server 400 may also comprise a remote server and/or a cloud server.
[0177] In some embodiments, the mobile robot 10 may collect initial sensor data 202 via a detecting component 200, may process the initial sensor data 202 by a processing component 300 (which may be part of the robot 10) to generate (processed) sensor data 302 and may subsequently send this sensor data 302 to the server 400 for further processing. Therefore, it will be understood that in some embodiments, the processing component 300 inbuilt in the mobile robot 10 may, for example, perform a pre-processing of the initial sensor data 202 and provide the information to a server 400 as processed sensor data 302. It will also be understood that the processed data 302 may be subjected to further analysis in the server 400. For this purpose, the server 400 may be bidirectionally connected to the processing component 300, and this bidirectional connection may be advantageous, as it may allow the mobile robot 10 to retrieve information from the server 400 and implement it to bring its assigned tasks to successful completion. Furthermore, further analyzing the processed sensor data 302 in a server 400 may also be advantageous in some instances, as the server 400 may comprise further and more advanced processes such as, for example, additional algorithms, pattern recognition, machine learning, advanced artificial intelligence, etc. Moreover, the server 400 may also comprise further storage modules configured to form a database, i.e. enough storage capacity to accumulate historical data and parameters. The historical data may, for instance, comprise historical records of events of a plurality of traffic environments 1000 (e.g. crossroads, road segments, sidewalks, etc.), e.g., hypotheses that were computed and their corresponding outcomes (e.g. a false negative, true positive, etc.). In some instances, the historical records of events may also include further information such as safety records and numbers of safety-endangering scenarios (e.g. numbers of accidents such as collisions).
[0178] In some instances, computer vision may be advantageous, as it may help the mobile robot 10 to understand the traffic environment 1000 in order to implement information for the execution of further tasks, e.g. automated decision-making processes such as stopping at a road to allow a moving vehicle to continue its route.
[0179] In simple words, the system 100 may comprise a detection component 200 configured to collect initial sensor data 202, which may subsequently be provided to a processing component 300 to generate sensor data 302, which may also be referred to as processed sensor data 302. The processed sensor data 302 may contain a plurality of parameters and data that may allow the mobile robot 10 to correctly execute assigned tasks in a traffic environment 1000.
[0180] That is, in summary, the system 100 depicted in
[0181] The detection component 200 may sense the surroundings of the robot 10, e.g., it may comprise cameras and/or radar sensors. Thus, initial sensor data 202 (e.g., images or radar data) may be created. The initial sensor data 202 may be processed by processing component 300 to generate processed sensor data 302. The processed sensor data 302 may comprise information relating to objects that were detected, e.g., it may comprise information relating to a size of an object, its speed, its direction of movement and its distance to the robot. The processed sensor data 302 may be provided to an additional processing component 400 that may generate findings 402 and 404 by utilizing the sensor data 302.
[0182] More particularly, the additional processing component 400 may generate a first finding 402 based on first sensor data and a validation finding 404 based on validation sensor data. The validation sensor data may be based on measurements taking place after the measurements on which the first sensor data is based. For example, the validation sensor data (and thus the validation finding 404) may be based on a measurement of a car that just passed in front of the robot 10, while the first sensor data (and thus the first finding 402) is based on a previous measurement of the car, i.e., on a measurement of the car while it was approaching. It will be understood that the validation finding (based on the car passing in front of the robot 10) may typically be more accurate than the first finding (based on observing a car having a substantial distance to the robot). Thus, the validation finding 404 may be used to test (i.e., validate) the initial finding.
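The data flow from the detecting component 200 through the processing component 300 to the additional processing component 400 may be sketched as follows. This is a simplified mock-up: the field names, the 2 m passing distance and the use of a Doppler speed are illustrative assumptions, not the disclosed implementation.

```python
def detecting_component(scene):
    """Component 200: produce initial sensor data 202 (here, a mock radar
    return with range and Doppler speed)."""
    return {"range_m": scene["car_distance"], "doppler_mps": scene["car_speed"]}

def processing_component(initial_202):
    """Component 300: produce processed sensor data 302 including distance,
    speed and direction-of-movement information."""
    return {"distance": initial_202["range_m"],
            "speed": initial_202["doppler_mps"],
            "approaching": initial_202["doppler_mps"] > 0.0}

def additional_processing_component(first_302, validation_302):
    """Component 400: generate the first finding 402 (far-range hypothesis)
    and the validation finding 404 (close-range passing event), then test
    the first finding against the validation finding."""
    finding_402 = first_302["approaching"]
    finding_404 = validation_302["distance"] < 2.0  # car passed close by
    return {"first_finding": finding_402,
            "validation_finding": finding_404,
            "first_finding_confirmed": finding_402 == finding_404}
```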
[0183]
[0184] Afterwards, the surveyed sensor data may be run through algorithms and/or detectors designed to detect faraway moving objects, e.g. faraway driving vehicles, such as cars and/or buses. Multiple sensor data may be combined at this point as well. These processes may be executed directly on the robot, and the output can comprise a probability of an approaching object from either direction, e.g. driving car(s) and/or bus(es). Alternatively or additionally, the output may simply comprise a binary result, e.g. YES/NO, indicating the presence or absence of approaching objects. Furthermore, as part of the algorithms and/or detectors, there may be a certain threshold above which an output may be considered a detection and below which it may not. Such an output may comprise a finding (also referred to as a hypothesis) on the presence of approaching vehicles on either side of the robot, on the segment of the road accessible to the sensors of the mobile robot 10. In one embodiment, a preferred combination of sensors for far-range object detection may be, for example, cameras and frequency-shift keying (FSK) radar.
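A minimal sketch of such a thresholded, per-direction output might look as follows; the threshold value and all names are illustrative assumptions.

```python
DETECTION_THRESHOLD = 0.6  # illustrative; may be tuned using validation results

def far_range_finding(prob_left, prob_right):
    """Turn per-direction detector probabilities into a binary finding
    (hypothesis) on approaching vehicles on either side of the robot."""
    return {"left": prob_left >= DETECTION_THRESHOLD,
            "right": prob_right >= DETECTION_THRESHOLD,
            "any": prob_left >= DETECTION_THRESHOLD
                   or prob_right >= DETECTION_THRESHOLD}
```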
[0185]
[0186] In simple words, in
[0187] If the object 30 detected at t.sub.1 is later present in second sensor data collected at t.sub.2=t.sub.obs (where t.sub.2=t.sub.obs is the validation observation time), the mobile robot 10 may record the event as a true occurrence, which may also be referred to as a true positive. In simple words, if an object 30 is detected by the mobile robot in a first sensor measurement and is once again detected in a second sensor measurement, the mobile robot 10 may confirm that the detected object 30 was in fact present and moving towards its position. In even more simple words, if an object 30, e.g. a car, passes in front of the mobile robot 10 and this object 30 was previously detected via a sensor, e.g. via a far-range sensor, the far-range detection is considered a true positive detection or simply a true positive.
[0188] In
[0189] In simple words, in
[0190] That is, generally,
[0191] If the object 30 detected at t.sub.1 is later not present in second sensor data collected at t.sub.2=t.sub.obs, the mobile robot 10 may record the event as a false occurrence, which may also be referred to as a false positive. In simple words, if an object 30 is detected by the mobile robot in a first sensor measurement but is not detected in a second sensor measurement, the mobile robot 10 may record that the detected object 30 was not moving towards its position. In even more simple words, if no object 30, e.g. a car, passes in front of the mobile robot 10 but such an object 30 was previously detected via a sensor, e.g. via a far-range sensor, the passing car detection is considered a false positive detection or simply a false positive.
[0192] Consider an exemplary scenario in
[0193] Moreover, the sensors 200 of the mobile robot 10 may also detect, for example, three moving objects of similar size moving towards the mobile robot but still on the opposite side of the road 20, conceptually identified as objects 62, 64 and 66. Furthermore, the mobile robot 10 may detect an additional static object with similar dimensions to those of objects 62, 64 and 66, positioned just in front of the mobile robot 10 and conceptually identified with reference numeral 60. Next to the object 60, the sensors 200 of the mobile robot 10 may also detect a static longitudinal object conceptually identified with reference numeral 68. The object 68 may further be identified with dimensions such that the mobile robot 10 may label the object 68 as, for example, a traffic light. Additionally, the sensors 200 may also provide information to the mobile robot 10 that there are additional potentially moving objects, such as, for example, objects 40 and/or 50, with their corresponding characteristics as mentioned earlier. Such a comprehensive detection of different objects in the traffic environment 1000 may represent a simple example of the computer vision of the mobile robot 10.
[0194] However, in another scenario, it may be possible that no objects are detected in a first measurement schematically depicted in
[0195] In
[0196] In simple words, the at least one sensor inbuilt in the mobile robot 10 may allow the collection of sensor data which, after a data processing step, may permit inferring that the surroundings of the current position of the mobile robot 10 are clear of other traffic participants that may converge with the trajectory of the mobile robot 10, e.g. moving vehicles. This identified scenario may serve as the basis for determining the probability of a correct detection executed by the mobile robot 10. In other words, the potential scenario assigned as free of moving objects may provide the input to validate whether, for example, the traffic conditions of the road 20 may allow the mobile robot 10 to continue moving without modifying its planned trajectory, e.g. without stopping at the road 20. However, when the sensors of the mobile robot 10, in a second measurement at t.sub.2=t.sub.obs, detect a potentially moving object 30, the mobile robot 10 may confirm that the sensor data collected at t.sub.1 did not contain true information (or was interpreted wrongly), i.e. the data collected in the first measurement led to a false negative. In more simple words, if no potentially moving objects 30 are identified in a first sensor measurement, but a moving object 30 passes in front of the mobile robot 10 and is thus identified by a second sensor measurement, the passing car detection is considered to be a false negative detection or simply a false negative. In even more simple words, if a moving object 30, e.g. a car, passes in front of the mobile robot 10, but it was not previously detected by, for example, a far-range sensor, then the event is recorded as a false negative. It will be understood that no detection of potentially moving objects 30 may also comprise a detection of potentially moving objects 30 below a minimum certainty threshold.
[0197] That is, in the scenario depicted in
[0198] Using the previously mentioned example of the false negative detection at time t.sub.1, where the mobile robot 10 is at a pedestrian crossing 20 and an object 50 is on the road 20, it may be possible that the object 50 is a bus parked at a bus stop. Further, it may also be possible that the object 50 is out of the detecting range of the sensors 200. Therefore, it may be possible that in a first measurement at a time t.sub.1 no moving objects are detected. However, the object 50 may start moving and continue its trajectory towards the mobile robot 10 on the road 20, which may result in the detection of a potentially moving object in a second measurement at an observation time t.sub.2. This event may then be labelled by the mobile robot 10 as a false negative, as it presents the characteristics schematically depicted in the
[0199] At a different time, for instance t.sub.2, which may be, e.g., after a couple of seconds, the mobile robot 10 may perform a second measurement using the same combination of sensors (or also different sensors) to collect a second sensor data set. It will be understood that the mobile robot 10 may also use a plurality of different sensors, for example, additional cameras, ultrasonic sensors, etc. I.e., the robot may use the same or other sensors (possibly other cameras, ultrasonic sensors, radars, time-of-flight sensors or any combination thereof). Such combinations of sensors may be advantageous, as they may allow increasing the precision and recall of the passing car detector.
[0200] In the above, it has been described that sensor data based on measurements at an observation time t.sub.obs=t.sub.2 may be used by the robot 10 to validate sensor data based on measurements at a first observation time t.sub.1 preceding the observation time. However, it should be understood that this validation step may also be performed “offline”, i.e., not locally on the robot, but on a server to which the sensor data has been uploaded.
[0201] That is, in one embodiment, all moving objects, i.e. all passing cars, may be found at all time points for all the mobile robots 10 offline in the servers. Subsequently, it may be possible to analyze whether every usual car detector detection was true or not. Furthermore, it may be possible to analyze whether a moving object was detected by some detector, for example, 2 s or 4 s before it actually passed in front of the mobile robot 10 on the traffic road and was thus detected by the detector at the observation time t.sub.2=t.sub.obs.
[0202] In some instances, it may be possible to determine the time, direction, speed and distance of the moving object; for example, it may be possible to determine that a moving object may be driving at 5 m/s, which may be useful to execute some loose matching, i.e. to estimate whether the moving object detected in a first sensor measurement corresponds to the moving object detected in a second sensor measurement.
[0203] In some embodiments, it may also be possible to estimate the time to passing in front of the mobile robot 10, i.e. it may be possible to predict the time that a moving object may require to effectively cross in front of the mobile robot 10. It will be understood that this rationale may also be used in reverse. I.e., by determining when an object has passed (by using the validation sensor data), it may be possible to determine when this object should have been detected in a previous measurement. It may also be possible to encounter scenarios where no moving objects were detected in a first sensor measurement, but a moving object passed in front of the mobile robot 10 a couple of seconds later and was thus detected in a second sensor measurement. For these scenarios, there may be an additional step and/or process to determine whether there was a detection below the threshold qualifying it as a detection and/or whether there was no detection at all. However, all detections may be recorded, even if no threshold criteria are met. Additionally or alternatively, manual annotations for systems testing, e.g. passing car detectors, may be included.
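The reverse rationale, i.e. working backwards from a validated passing event to the time at which the object should have been detected, and then distinguishing a below-threshold detection from no detection at all, may be sketched as follows. The detector range, threshold and matching window are illustrative assumptions.

```python
def expected_detection_time(passing_time, speed_mps, detector_range_m):
    """Working backwards from a validated passing event: the time at which
    the object should have entered the far-range detector's range."""
    return passing_time - detector_range_m / speed_mps

def diagnose_miss(detections, passing_time, speed_mps,
                  detector_range_m=40.0, threshold=0.6, window_s=2.0):
    """Distinguish a sub-threshold detection from no detection at all around
    the time the object should have been visible.  Each detection is a dict
    with a timestamp "t" and a detector confidence "score"."""
    t_expected = expected_detection_time(passing_time, speed_mps, detector_range_m)
    nearby = [d for d in detections if abs(d["t"] - t_expected) <= window_s]
    if any(d["score"] >= threshold for d in nearby):
        return "detected"
    return "below-threshold detection" if nearby else "no detection"
```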
[0204] In
[0205] In simple words, the at least one sensor inbuilt in the mobile robot 10 may allow the collection of initial or raw sensor data, which, after a data processing step, may permit inferring that the surroundings of the current position of the mobile robot 10 are clear of other traffic participants whose trajectories may converge with that of the mobile robot 10, e.g. moving vehicles. In other words, the scenario potentially assigned as free of moving objects may provide the input to validate whether, for example, the traffic conditions of roads 20 allow the mobile robot 10 to continue moving without modifying its planned trajectory, e.g. without stopping at the road 20. Moreover, when the sensors of the mobile robot 10, in a second measurement at time t.sub.2=t.sub.obs, detect no potentially moving object 30, the mobile robot 10 may confirm that the sensor data collected at t.sub.1 contained true information, i.e. that no moving objects 30 were identified in the first sensor measurement (see the corresponding figure).
[0206] It will be understood that false negative detections (e.g., a non-detection when there is actually a car approaching; see, e.g., the corresponding figure) may also occur.
[0207] It will be understood that technologies to review and validate a hypothesis based on prediction sensor data (i.e., sensor data obtained at the first observation time t.sub.1) are desirable. To do that, in simple words, sensor data obtained at a later stage (i.e., at observation time t.sub.obs, also referred to as t.sub.2) may be used. It will be understood that the validation of the hypothesis based on the sensors 200 may not be performed in real time on the mobile robot 10. Further, the detector may use future sensor data to validate present sensor data, and/or present sensor data to validate past sensor data. In simpler words, the passing car detector may use second sensor data to validate first sensor data, i.e., it may use the data collected in a second sensor measurement to validate the data collected in a first sensor measurement. Moreover, the passing car detectors may be useful for other purposes, such as, for example, testing the quality of far-range car detectors; thresholds for the detection of moving objects may also be adjusted based on testing done with the passing car detector. Further, the present technology can also be used for estimating traffic density.
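The core validation step, i.e. labeling the first finding using the later validation finding as ground truth, may be sketched as follows (an illustrative helper, not the claimed implementation):

```python
def label_first_finding(first_detected: bool, validated: bool) -> str:
    """Label the finding from the first sensor measurement using the
    later validation measurement as ground truth.

    first_detected -- whether an object was found at the first observation time
    validated      -- whether an object actually passed (validation finding)
    """
    if first_detected and validated:
        return "true positive"
    if first_detected and not validated:
        return "false positive"
    if not first_detected and validated:
        return "false negative"
    return "true negative"
```

Labels produced in this way can then be aggregated offline, e.g. per robot or per road segment, as described further below.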
[0208] In some embodiments, the validation of sensor data 202 may also be comprehensive data validation, i.e. other information regarding the detected objects may also be confirmed. For example, a true positive of an event for a moving object with a speed of 40 km/h may be labelled such that the speed of the moving object is also recorded as a true positive. It will be understood that the comprehensive data validation may be extended to all parameters surveyed from the traffic environment 1000 via the sensors 200, and it will also be understood that the validation may be performed for only a single parameter and/or for any combination of surveyed data, e.g. either only the validation of the presence of a detected moving object and/or the validation of a data set containing the presence of a detected moving object, its speed and its direction of movement.
[0209]
[0210] At a later stage, when the truck is passing (see the corresponding figure), the initial detection may be validated.
[0211]
[0212] Put differently, the different road segments are identified as C1 and C2, and R1 to R5 represent different robots. D1 and D2 represent the results of measurements at the first observation time t.sub.1 and the second observation time t.sub.2=t.sub.obs, respectively. That is, the table for road segment C1 indicates that robots R2 to R5 have detected, e.g., a car in an initial measurement and that this detection was confirmed by the second measurement (all these results are thus true positives). However, robot R1 has not detected a car in the first measurement, but has detected the car in the second measurement (this result is thus a false negative).
[0213] That is, the mobile robots 10 (R1-R5) may subsequently survey the same road segments (C1 and C2). It may then be observed that the data collected by all robots match in, for example, detecting a moving object (i.e. a true positive), except for one mobile robot 10, for example R1, which may repeatedly miss identifying a moving object in a first measurement D1, i.e. a false negative. Further, a corresponding pattern could also apply in the second road segment. If such repeated events are compared and analyzed using a historical data set, it may be possible to identify which mobile robot 10 has sensors that are not working optimally or are malfunctioning. Furthermore, it may also be possible to trace the problem down to individual sensors, which may facilitate adjusting the sensor and restoring its performance. Such an approach may also be advantageous, as it may facilitate maintenance of the sensors of mobile robots 10.
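The offline analysis described above may be sketched as follows. The record format, rate threshold, and segment count are illustrative assumptions; the key idea is that a sensor fault travels with the robot, whereas occlusions or fog affect all robots at one segment:

```python
from collections import defaultdict

def suspect_robots(records, min_fn_rate=0.5, min_segments=2):
    """Flag robots whose first measurements are repeatedly false negatives
    across several distinct road segments.

    records -- iterable of (robot_id, segment_id, label) tuples, where
               label is e.g. 'true positive' or 'false negative'
    """
    stats = defaultdict(lambda: [0, 0, set()])  # robot -> [fn count, total, fn segments]
    for robot, segment, label in records:
        s = stats[robot]
        s[1] += 1
        if label == "false negative":
            s[0] += 1
            s[2].add(segment)
    return [r for r, (fn, total, segs) in stats.items()
            if total and fn / total >= min_fn_rate and len(segs) >= min_segments]
```

Applied to the R1-R5 / C1-C2 example above, only a robot like R1 that misses objects on both road segments would be flagged, while robots that fail only at a single occluded segment would not.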
[0214] That is, the present technology may also allow detecting malfunctioning robots or malfunctioning sensors in robots. It will be understood that false negative detections may be caused by a plurality of circumstances. According to a first example, it may be impossible for a robot to detect an object if a stationary obstacle (such as a tree) is located between the robot and the object, as the obstacle may block the view onto the object. As a second example, weather conditions may lead to false negative results. Consider, e.g., the situation of dense fog. In such a situation it may be impossible for the robot to detect an approaching car at the first observation time due to lack of visibility. In a third example, a sensor of a robot may malfunction, and thus it may not be possible to detect a car at a first observation time.
[0215] With regard to the above examples, it will be understood that the first two examples (stationary obstacles and low visibility due to fog) may impact different robots in the same way. Thus, in such scenarios, different robots would yield false negative results. However, in the third example (false negative due to a malfunctioning detector), only the robot with the malfunctioning detector would yield false negative results. More particularly, this robot would consistently yield false negative results, i.e., not only for one road segment, but for different road segments. This may give rise to results as depicted in the corresponding figure.
[0216]
[0217] In a second step conceptually identified by reference numeral S2, the method may comprise, based on the measured first data 202, computing a finding or hypothesis regarding a presence of moving vehicles approaching the mobile robot 10 on the observed stretch of the traffic road, e.g. an object on the road 28. For instance, the mobile robot 10 may determine that the object 50 on the road 28 is moving at 50 km/h towards the road 24. Furthermore, the mobile robot 10 may also consider that the object 50 is of a given size range corresponding to, for example, a bus and/or a truck.
[0218] Subsequently, in a third step conceptually identified by reference numeral S3, the method may also comprise measuring second data, i.e., measuring validation data, relating to a road segment via at least one second sensor 200 of a mobile robot 10 at a second observation time t.sub.2 (also referred to as t.sub.obs and also referred to as validation observation time), wherein the time t.sub.2 is after time t.sub.1. It will be understood that the second sensor 200 may coincide with the first sensor, or may be different from the first sensor. In this second measurement, the mobile robot 10 may receive information that allows evaluating the hypothesis; for example, a potentially moving object 50 may indeed be detected during the second measurement. Therefore, in a fourth step conceptually identified by reference numeral S4, the method may further comprise, based on the measured second data, validating the accuracy of the computed hypothesis. For instance, the mobile robot 10 may record the information from the first measurement, after contrasting it with the information contained in the second measurement, as a true positive, i.e. the potentially moving object 50 was in fact, for example, a bus moving towards the mobile robot 10 on the road 24.
[0219] Again, with general reference to the accompanying figures, the following is noted.
[0220] First, at an observation time t.sub.1 (cf. the corresponding figure), first sensor data may be obtained and a finding regarding an approaching vehicle may be generated based thereon.
[0221] Second, when detecting the passing vehicle at the observation time t.sub.obs=t.sub.2, it should be understood that sensor data (such as images) obtained shortly before and after this time may also be used. E.g., when the center of the vehicle 50 is directly in front of the robot at time t.sub.2, additional sensor data obtained, e.g., less than 1 s before and after this time t.sub.2 may also be used to detect the passing vehicle. This may render the detection of the vehicle more reliable. In other words, for the detection of the passing vehicle, "future data" from the passing car event can be used, i.e., this step can use data from the left and right of the robot 10, which may render this step considerably more reliable.
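Aggregating frames in a window around t.sub.2, as described above, may be sketched as follows (the window width and the minimum number of hits are illustrative parameters):

```python
def passing_detected(detections, t2, window=1.0, min_hits=2):
    """Confirm a passing event at time t2 by requiring enough per-frame
    detections within +/- window seconds around t2, which is more robust
    than relying on a single frame.

    detections -- list of (timestamp, detected) pairs from individual frames
    """
    hits = sum(1 for t, d in detections if d and abs(t - t2) <= window)
    return hits >= min_hits
```

A single spurious frame thus neither creates nor suppresses a passing event on its own.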
[0222] Third, as discussed before, the calculation to validate the accuracy of the finding based on the data obtained at observation time t.sub.1 (corresponding to the first sensor measurement) may be performed offline, i.e., not in real time on the mobile robot 10.
[0223] Generally, it will be understood that in embodiments of the present technology, a large amount of data is collected and annotated. This data can be used to determine, for example, how far away vehicles (such as cars) can be seen at different locations on average, e.g., for each crossing, an average distance can be determined at which vehicles can be detected. Further, for some crossings it can be determined that they can be mapped in 2D in a better way. E.g., based on the placements and tracks of the detected vehicles, it may be possible to amend and fine-tune the information about the topology of roads and crossings, i.e. the road map, at a given location. Further still, the present technology may allow determining that in some places occlusions occur more often than in others, and the robots could thus avoid such places. Also, the performance of different detector properties can be determined with these methods (e.g. a detector setup A can "see" cars at a greater distance than detector setup B, but detector setup B can "see" cars with a higher probability at a distance of 20 m than detector setup A).
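The per-crossing statistic mentioned above (average distance at which vehicles can be detected) may be sketched as follows, assuming an illustrative annotation format of (crossing id, first-detection distance) pairs:

```python
from collections import defaultdict

def mean_detection_distance(annotations):
    """Compute, per crossing, the average distance at which vehicles
    were first detected.

    annotations -- iterable of (crossing_id, distance_m) pairs, one per
                   validated vehicle detection
    """
    sums = defaultdict(lambda: [0.0, 0])  # crossing -> [distance sum, count]
    for crossing, dist in annotations:
        sums[crossing][0] += dist
        sums[crossing][1] += 1
    return {c: total / n for c, (total, n) in sums.items()}
```

Such per-location statistics could then feed route planning, e.g. preferring crossings with longer average detection distances.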
[0224] Finally, embodiments of the present technology can also be used to set detection thresholds for detection algorithms. That is, the annotated data (where data based on first sensor data is annotated based on validation sensor data) can be used to set thresholds for the detection algorithms transforming the first sensor data to first findings.
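Setting a detection threshold from the annotated data may be sketched as follows. The use of the F1 score as the selection criterion is an illustrative choice, not something specified by the present disclosure:

```python
def best_threshold(samples, candidates):
    """Pick the candidate threshold with the highest F1 score on
    annotated data.

    samples    -- list of (detector_score, is_vehicle) pairs, where
                  is_vehicle comes from the validation measurement
    candidates -- iterable of threshold values to evaluate
    """
    def f1(th):
        tp = sum(1 for s, y in samples if s >= th and y)
        fp = sum(1 for s, y in samples if s >= th and not y)
        fn = sum(1 for s, y in samples if s < th and y)
        denom = 2 * tp + fp + fn
        return 2 * tp / denom if denom else 0.0
    return max(candidates, key=f1)
```

In this way, the first findings annotated with validation findings serve directly as labeled training data for tuning the detection algorithms.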
[0225] While in the above, preferred embodiments have been described with reference to the accompanying drawings, the skilled person will understand that these embodiments were provided for illustrative purpose only and should by no means be construed to limit the scope of the present invention, which is defined by the claims.
[0226] Whenever a relative term, such as “about”, “substantially” or “approximately” is used in this specification, such a term should also be construed to also include the exact term. That is, e.g., “substantially straight” should be construed to also include “(exactly) straight”.
[0227] Whenever steps were recited in the above or also in the appended claims, it should be noted that the order in which the steps are recited in this text may be accidental. That is, unless otherwise specified or unless clear to the skilled person, the order in which steps are recited may be accidental. That is, when the present document states, e.g., that a method comprises steps (A) and (B), this does not necessarily mean that step (A) precedes step (B), but it is also possible that step (A) is performed (at least partly) simultaneously with step (B) or that step (B) precedes step (A). Furthermore, when a step (X) is said to precede another step (Z), this does not imply that there is no step between steps (X) and (Z). That is, step (X) preceding step (Z) encompasses the situation that step (X) is performed directly before step (Z), but also the situation that (X) is performed before one or more steps (Y1), . . . , followed by step (Z). Corresponding considerations apply when terms like “after” or “before” are used.