SYSTEM FOR 3D SURVEYING BY AN AUTONOMOUS ROBOTIC VEHICLE USING LIDAR-SLAM AND AN ESTIMATED POINT DISTRIBUTION MAP FOR PATH PLANNING

20230064071 · 2023-03-02

Abstract

A system for providing 3D surveying of an environment by an autonomous robotic vehicle, comprising a SLAM unit for carrying out a simultaneous localization and mapping process, a path planning unit to determine a path to be taken by the autonomous robotic vehicle, and a lidar device. The lidar device is configured to generate lidar data, which the SLAM unit receives as part of the perception data for the SLAM process. The path planning unit is configured to determine the path to be taken by carrying out an evaluation of a further trajectory within a map of the environment in relation to an estimated point distribution map for an estimated 3D point cloud, which is provided by the lidar device on the further trajectory and projected onto the map of the environment.

Claims

1. A system for providing 3D surveying of an environment by an autonomous robotic vehicle, the system comprising: a simultaneous localization and mapping unit, SLAM unit, configured to carry out a simultaneous localization and mapping process, SLAM process, the SLAM process comprising reception of perception data providing a representation of the surroundings of the autonomous vehicle at a current position, use of the perception data to generate a map of an environment, and determination of a trajectory of a path that the autonomous vehicle has passed within the map of the environment, a path planning unit, configured to determine a path to be taken by the autonomous robotic vehicle based on the map of the environment, and a lidar device specifically intended to be mounted on the autonomous robotic vehicle and configured to generate lidar data to provide a coordinative scan of the environment relative to the lidar device, wherein the system is configured to generate the lidar data during a movement of the lidar device and to provide a referencing of the lidar data with respect to a common coordinate system for determining a 3D survey point cloud of the environment, wherein: the lidar device is configured to have a field-of-view of 360 degrees about a first axis and 130 degrees about a second axis perpendicular to the first axis and to generate the lidar data with a point acquisition rate of at least 300′000 points per second, the SLAM unit is configured to receive the lidar data as part of the perception data and, based thereon, to generate the map of the environment and to determine the trajectory of the path that the autonomous vehicle has passed within the map of the environment, and the path planning unit is configured to determine the path to be taken by carrying out an evaluation of a further trajectory within the map of the environment in relation to an estimated point distribution map for an estimated 3D point cloud, which is provided by the lidar device on the further trajectory and projected onto the map of the environment.

2. The system according to claim 1, wherein the lidar device is embodied as a laser scanner, which is configured to generate the lidar data using a rotation of a laser beam about two rotation axes, wherein: the laser scanner comprises a rotating body configured to rotate about one of the two rotation axes and to provide for a variable deflection of an outgoing and a returning part of the laser beam, thereby providing a rotation of the laser beam about the one of the two rotation axes, fast axis, the rotating body is rotated about the fast axis with at least 50 Hz and the laser beam is rotated about the other of the two rotation axes, slow axis, with at least 0.5 Hz, the laser beam is emitted as a pulsed laser beam, particularly wherein the pulsed laser beam comprises 1.5 million pulses per second, and for the rotation of the laser beam about the two axes the field-of-view about the fast axis is 130 degrees and about the slow axis 360 degrees.

3. The system according to claim 1, wherein the path planning unit is configured to receive an evaluation criterion defining different measurement specifications of the system, particularly different target values for the survey point cloud, and to take into account the evaluation criterion for the evaluation of the further trajectory, wherein the evaluation criterion defines at least one of: a desired path through the environment, a point density of the survey point cloud projected onto the map of the environment, particularly at least one of a minimum, a maximum, and a mean point density of the survey point cloud projected onto the map of the environment, an energy consumption threshold, particularly a maximum allowable energy consumption, for the system for completing the further trajectory and providing the survey point cloud, a time consumption threshold, particularly a maximum allowable time, for the system for completing the further trajectory and providing the survey point cloud, a path length threshold, particularly a minimal path length and/or a maximum allowable path length, of the further trajectory, a minimal area of the trajectory to be covered, a minimal spatial volume covered by the survey point cloud, and a minimum or maximum horizontal angle between a heading direction at the end of the trajectory of the path that the autonomous vehicle has passed and a heading direction at the beginning of the further trajectory.

4. The system according to claim 1, wherein the path planning unit is configured to receive a path of interest and is configured to optimize and/or extend the path of interest to determine the path to be taken.

5. The system according to claim 1, wherein: the system is configured to access identification information of a reference object and assignment data, wherein the assignment data provide for an assignment of the reference object to a trajectory specification within the vicinity of the reference object, particularly a further heading direction with respect to an outer coordinate system or with respect to a cardinal direction, the system comprises a reference object detector configured to use the identification information and, based thereon, to provide a detection of the reference object within the environment, and upon the detection of the reference object the path planning unit is configured to take into account the trajectory specification in the evaluation of the further trajectory, the system is configured to access a 3D reference model of the environment, wherein the trajectory specification is provided relative to the 3D reference model, particularly wherein the trajectory specification provides a planned path within the 3D reference model, the assignment data provide an assignment of the reference object to a position within the 3D reference model, and the system is configured to determine a frame transformation between the map of the environment and the 3D reference model by taking into account the assignment of the reference object to the position within the 3D reference model.

6. The system according to claim 1, wherein the system comprises: a fiducial marker configured to provide an indication of a local trajectory direction relative to the fiducial marker, particularly a visible mark providing for visual determination of the local trajectory direction, and a fiducial marker detector configured to detect the fiducial marker and to determine the local trajectory direction, and wherein the path planning unit is configured to take into account the local trajectory direction in the evaluation of the further trajectory.

7. The system according to claim 6, wherein the fiducial marker is configured to provide a, particularly visible, indication of the directions of at least two, particularly three, of the three main axes which span the common coordinate system, wherein the system is configured to determine the directions of the three main axes by using the fiducial marker detector, and the system is configured to take into account the directions of the three main axes for providing the referencing of the lidar data with respect to the common coordinate system.

8. The system according to claim 6, wherein the fiducial marker comprises a reference value indication, which provides positional information, particularly 3D coordinates, regarding a set pose of the fiducial marker in the common coordinate system or in an outer coordinate system, particularly a world-coordinate system, wherein the system is configured to derive the set pose and to take into account the set pose to determine the local trajectory direction, particularly by determining a pose of the fiducial marker in the common coordinate system or in the world coordinate system and carrying out a comparison of the determined pose of the fiducial marker and the set pose.

9. The system according to claim 6, wherein the fiducial marker is configured to provide an indication of a corresponding action to be carried out by the system, wherein the system is configured to determine the corresponding action by using the fiducial marker detector, particularly wherein the indication of the corresponding action is provided by a visible code, particularly a barcode, more particularly a matrix barcode, wherein the corresponding action is at least one of: a stop operation of the system, a pause operation of the system, a restart operation of the system, a return to an origin of a measurement task, an omission of entering an area in the vicinity of the fiducial marker, and a time-controlled entry into an area in the vicinity of the fiducial marker, wherein the path planning unit is configured to take into account the corresponding action in the evaluation of the further trajectory.

10. The system according to claim 6, wherein the fiducial marker comprises a visually detectable pattern, particularly provided by areas of different reflectivity, different gray scales and/or different colors, and the system is configured to determine a 3D orientation of the pattern by: determining geometric features in an intensity image of the pattern, wherein the intensity image of the pattern is acquired by a scanning of the pattern with a lidar measurement beam of the lidar device and a detection of an intensity of a returning lidar measurement beam, and carrying out a plane fit algorithm in order to determine an orientation of a pattern plane, by analyzing an appearance of the geometric features in the intensity image of the pattern.

11. The system according to claim 10, wherein: the pattern comprises a circular feature, the system is configured to identify an image of the circular feature within the intensity image of the pattern, and the plane fit algorithm is configured to fit an ellipse to the image of the circular feature and, based thereon, to determine the orientation of the pattern plane, wherein the center of the ellipse is determined and aiming information for aiming with the lidar measurement beam to the center of the ellipse is derived, wherein the pattern comprises inner geometric features, particularly comprising rectangular features, which are enclosed by the circular feature, more particularly wherein the inner geometric features are configured to provide the indication of the local trajectory direction and the system is configured to determine the local trajectory direction by analyzing the intensity image of the pattern and by taking into account the 3D orientation of the pattern.

12. The system according to claim 10, wherein the system is configured: to use the scanning of the pattern with the lidar device and the detection of the intensity of the returning lidar measurement beam to determine a first geometric shape of the pattern, to carry out a comparison of the first geometric shape with an expected shape of the pattern, particularly by taking into account the orientation of the pattern plane, more particularly the 3D orientation of the pattern, and, based thereon, to carry out an evaluation, particularly a determination, of an optical alignment of the optics of the lidar device, wherein the system comprises a camera specifically intended to be mounted on the autonomous robotic vehicle and configured to generate camera data during a movement of the camera, wherein the system is configured: to image the pattern by the camera and to determine a second geometric shape of the pattern, to carry out a comparison of the second geometric shape with the expected shape of the pattern, particularly by taking into account the orientation of the pattern plane, more particularly the 3D orientation of the pattern, and to take into account the comparison of the second geometric shape with the expected shape of the pattern in the evaluation, particularly the determination, of the optical alignment of the optics of the lidar device.

13. The system according to claim 12, wherein: the system is configured to carry out a system monitoring comprising a measurement of bumps and/or a vibration of the lidar device, and to automatically carry out the evaluation, particularly the determination, of the optical alignment of the optics of the lidar device as a function of the system monitoring.

14. The system according to claim 1, wherein the system comprises: an on-board computing unit specifically intended to be located on the autonomous robotic vehicle and configured to carry out at least part of a system processing, wherein the system processing comprises carrying out the SLAM process, providing the referencing of the lidar data, and carrying out the evaluation of the further trajectory, an external computing unit configured to carry out at least part of the system processing, a communication module configured to provide for a communication between the on-board computing unit and the external computing unit, and a workload selection module configured: to monitor an available bandwidth of the communication module for the communication between the on-board computing unit and the external computing unit, to monitor an available power of the on-board computing unit, the lidar device, the SLAM unit, and the path planning unit, and to dynamically change an assignment of at least part of the system processing to the on-board computing unit and the external computing unit depending on the available bandwidth and the available power assigned to the external computing unit.

15. The system according to claim 1, wherein the system is configured to receive an additional map of the environment generated by means of another SLAM process associated with another autonomous robotic vehicle, and the evaluation of the further trajectory takes into account the additional map of the environment by evaluating an estimated point distribution map for an estimated 3D point cloud provided by the lidar device on a trajectory segment of the further trajectory within the additional map of the environment and projected onto the additional map of the environment.

16. The system according to claim 2, wherein the path planning unit is configured to receive an evaluation criterion defining different measurement specifications of the system, particularly different target values for the survey point cloud, and to take into account the evaluation criterion for the evaluation of the further trajectory, wherein the evaluation criterion defines at least one of: a desired path through the environment, a point density of the survey point cloud projected onto the map of the environment, particularly at least one of a minimum, a maximum, and a mean point density of the survey point cloud projected onto the map of the environment, an energy consumption threshold, particularly a maximum allowable energy consumption, for the system for completing the further trajectory and providing the survey point cloud, a time consumption threshold, particularly a maximum allowable time, for the system for completing the further trajectory and providing the survey point cloud, a path length threshold, particularly a minimal path length and/or a maximum allowable path length, of the further trajectory, a minimal area of the trajectory to be covered, a minimal spatial volume covered by the survey point cloud, and a minimum or maximum horizontal angle between a heading direction at the end of the trajectory of the path that the autonomous vehicle has passed and a heading direction at the beginning of the further trajectory.

17. The system according to claim 7, wherein the fiducial marker comprises a reference value indication, which provides positional information, particularly 3D coordinates, regarding a set pose of the fiducial marker in the common coordinate system or in an outer coordinate system, particularly a world-coordinate system, wherein the system is configured to derive the set pose and to take into account the set pose to determine the local trajectory direction, particularly by determining a pose of the fiducial marker in the common coordinate system or in the world coordinate system and carrying out a comparison of the determined pose of the fiducial marker and the set pose.

18. The system according to claim 8, wherein the fiducial marker is configured to provide an indication of a corresponding action to be carried out by the system, wherein the system is configured to determine the corresponding action by using the fiducial marker detector, particularly wherein the indication of the corresponding action is provided by a visible code, particularly a barcode, more particularly a matrix barcode, wherein the corresponding action is at least one of: a stop operation of the system, a pause operation of the system, a restart operation of the system, a return to an origin of a measurement task, an omission of entering an area in the vicinity of the fiducial marker, and a time-controlled entry into an area in the vicinity of the fiducial marker, wherein the path planning unit is configured to take into account the corresponding action in the evaluation of the further trajectory.

19. The system according to claim 9, wherein the fiducial marker comprises a visually detectable pattern, particularly provided by areas of different reflectivity, different gray scales and/or different colors, and the system is configured to determine a 3D orientation of the pattern by: determining geometric features in an intensity image of the pattern, wherein the intensity image of the pattern is acquired by a scanning of the pattern with a lidar measurement beam of the lidar device and a detection of an intensity of a returning lidar measurement beam, and carrying out a plane fit algorithm in order to determine an orientation of a pattern plane, by analyzing an appearance of the geometric features in the intensity image of the pattern.

20. The system according to claim 11, wherein the system is configured: to use the scanning of the pattern with the lidar device and the detection of the intensity of the returning lidar measurement beam to determine a first geometric shape of the pattern, to carry out a comparison of the first geometric shape with an expected shape of the pattern, particularly by taking into account the orientation of the pattern plane, more particularly the 3D orientation of the pattern, and, based thereon, to carry out an evaluation, particularly a determination, of an optical alignment of the optics of the lidar device, wherein the system comprises a camera specifically intended to be mounted on the autonomous robotic vehicle and configured to generate camera data during a movement of the camera, wherein the system is configured: to image the pattern by the camera and to determine a second geometric shape of the pattern, to carry out a comparison of the second geometric shape with the expected shape of the pattern, particularly by taking into account the orientation of the pattern plane, more particularly the 3D orientation of the pattern, and to take into account the comparison of the second geometric shape with the expected shape of the pattern in the evaluation, particularly the determination, of the optical alignment of the optics of the lidar device.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0053] The system according to the different aspects is described or explained in more detail below, purely by way of example, with reference to working examples shown schematically in the drawings. Identical elements are labelled with the same reference numerals in the figures. The described embodiments are generally not shown true to scale and they are also not to be interpreted as limiting. Specifically,

[0054] FIG. 1: an exemplary embodiment of an autonomous robotic vehicle equipped with a lidar device;

[0055] FIG. 2: an exemplary workflow of using an autonomous robotic vehicle;

[0056] FIG. 3: an exemplary embodiment of the lidar device as two-axis laser scanner;

[0057] FIG. 4: exemplary communication schemes between different components of the system;

[0058] FIG. 5: an exemplary schematic of a path planning software making use of a reference object;

[0059] FIG. 6: an exemplary use of fiducial markers;

[0060] FIG. 7: an exemplary embodiment of a fiducial marker.

DETAILED DESCRIPTION

[0061] FIG. 1 depicts an example embodiment of an autonomous robotic vehicle equipped with a lidar device to be used in a system for providing 3D surveying. Here, the robotic vehicle 1 is embodied as a four-legged robot. For example, such robots are often used in unknown terrain with different surface properties having debris and steep inclines. The robot 1 has sensors 2 and processing capabilities to provide for simultaneous localization and mapping, which comprises reception of perception data providing a representation of the surroundings of the autonomous robot 1 at a current position, use of the perception data to generate a map of the environment, and determination of a trajectory of a path that the robot 1 has passed within the map of the environment.

[0062] According to one aspect, the robot is equipped with a lidar device 3, which has a field-of-view of 360 degrees about a vertical axis 4 and a vertical field-of-view 5 of at least 130 degrees about a horizontal axis (see FIG. 3), wherein the lidar device is configured to generate the lidar data with a point acquisition rate of at least 300′000 points per second. Here, the lidar device is exemplarily embodied as a so-called two-axis laser scanner (see FIG. 3), wherein the vertical axis 4 is also referred to as slow axis and the horizontal axis is also referred to as fast axis.

[0063] The SLAM unit is configured to receive the lidar data as part of the perception data, which, for example, provides an improved field-of-view and viewing distance and thus improved larger-scale path determination. For example, this is particularly beneficial for exploring unknown terrain. Another benefit comes with the all-around horizontal field-of-view about the vertical axis 4 and the vertical field-of-view 5 of 130 degrees about the horizontal axis, which provides the capability to essentially cover the front, the back, and the ground at the same time.

[0064] The system, comprising the legged robot 1 and the lidar device 3, further comprises a path planning unit, configured to carry out an evaluation of a further trajectory within the map of the environment in relation to an estimated point distribution map for an estimated 3D point cloud, which is provided by the lidar device 3 on the further trajectory and projected onto the map of the environment.

[0065] By way of example, a potential further trajectory is provided by an external source and the system is configured to optimize and/or extend (e.g. explore more rooms) the potential further trajectory, e.g. to provide a desired point distribution when generating lidar data on an optimized further trajectory. The further trajectory may also be determined “from scratch”, e.g. by using an algorithm configured to optimize distances to the walls and/or by implementing optimization principles based on so-called watertight probabilistic occupancy maps and dense maximum-likelihood occupancy voxel maps.
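By way of illustration, the evaluation of a further trajectory against an estimated point distribution map can be sketched in a few lines: a 2D occupancy grid stands in for the map of the environment, a list of poses for the candidate trajectory, and a crude uniform ray-casting model for the estimated point cloud projected onto the map. All names, the ray model, and the scoring criterion are illustrative assumptions, not the patented algorithm.

```python
import numpy as np

def estimate_point_density(grid, trajectory, points_per_pose=1000,
                           n_rays=180, max_range=30):
    """Project an estimated point cloud onto a 2D occupancy grid.

    grid: 2D array, 1 = occupied (wall), 0 = free.
    trajectory: iterable of (row, col) scanner poses.
    Hypothetical model: each pose spreads its points uniformly over
    n_rays horizontal directions; each ray deposits its share of
    points on the first occupied cell it hits.
    """
    density = np.zeros_like(grid, dtype=float)
    pts_per_ray = points_per_pose / n_rays
    for r0, c0 in trajectory:
        for k in range(n_rays):
            ang = 2.0 * np.pi * k / n_rays
            dr, dc = np.sin(ang), np.cos(ang)
            for step in range(1, max_range):
                r = int(round(r0 + dr * step))
                c = int(round(c0 + dc * step))
                if not (0 <= r < grid.shape[0] and 0 <= c < grid.shape[1]):
                    break
                if grid[r, c]:            # ray hits a surface
                    density[r, c] += pts_per_ray
                    break
    return density

def score_trajectory(grid, trajectory, min_density=1.0):
    """Evaluation criterion: fraction of occupied (wall) cells that
    reach the target projected point density."""
    density = estimate_point_density(grid, trajectory)
    walls = grid.astype(bool)
    return float((density[walls] >= min_density).mean())
```

A path planner could then compare candidate further trajectories simply by their score, or reject those falling below a required coverage fraction.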

[0066] An exemplary workflow of using an autonomous robotic vehicle is depicted by FIG. 2, schematically showing a building floor to be surveyed, wherein a path of interest, e.g. a potential further trajectory, is provided by a mobile reality capture device (top part of the figure).

[0067] For example, the potential further trajectory is a recorded trajectory of a mobile surveying device which has previously measured the environment or a trajectory through setup points of a total station, e.g. wherein the total station includes a camera and a SLAM functionality to determine a movement path of the total station.

[0068] In the exemplary workflow depicted by the figure, a user walks through the building and thereby roughly surveys the environment by using a handheld mobile mapping device such as the BLK2GO reality capture device of Leica Geosystems, thereby defining the path of interest 30, i.e. the trajectory taken by the BLK2GO device.

[0069] As depicted in the bottom part of the figure, the autonomous robot then follows the path of interest 30 (post mission or live while the user is leading with the BLK2GO) on an optimized trajectory 31, which provides optimal point coverage of the lidar device, e.g. wherein distances to walls and objects within the environment are optimized and wherein open spaces and additional rooms along the path of interest are explored.

[0070] The optimized trajectory 31 includes sections associated with exploration areas 32, e.g. areas which have been omitted by the user or were inaccessible to the user when surveying the building with the mobile reality capture device. Other sections of the optimized trajectory 31 relate to rooms 33 where the user has chosen a trajectory poorly suited to generating the desired quality of the point cloud. For example, the optimized trajectory 31 differs from the initially provided path of interest 30 in that an optimized trajectory is used to improve point density and room coverage by reducing hidden areas due to line-of-sight obstruction.

[0071] FIG. 3 shows an exemplary embodiment of the lidar device 3 of FIG. 1 in the form of a so-called two-axis laser scanner. The laser scanner comprises a base 7 and a support 8, the support 8 being rotatably mounted on the base 7 about the vertical axis 4. Often the rotation of the support 8 about the vertical axis 4 is also called azimuthal rotation, regardless of whether the laser scanner, or the vertical axis 4, is aligned exactly vertically.

[0072] The core of the laser scanner is an optical distance measuring unit 9 arranged in the support 8 and configured to perform a distance measurement by emitting a pulsed laser beam 10, e.g. wherein the pulsed laser beam comprises 1.5 million pulses per second, and by detecting returning parts of the pulsed laser beam by means of a receiving unit comprising a photosensitive sensor. Thus, a pulse echo is received from a backscattering surface point of the environment, wherein a distance to the surface point can be derived based on the time of flight, the shape, and/or the phase of the emitted pulse.
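The time-of-flight distance derivation described above can be sketched as follows; the factor 1/2 accounts for the out-and-back path of the pulse, and the constant refractive index of air is a simplifying assumption (real instruments apply atmospheric and waveform corrections):

```python
C_VACUUM = 299_792_458.0   # speed of light in vacuum, m/s

def tof_distance(round_trip_time_s, refractive_index=1.0003):
    """Distance to the backscattering surface point from the measured
    pulse round-trip time. The beam travels to the surface and back,
    hence the division by two."""
    return (C_VACUUM / refractive_index) * round_trip_time_s / 2.0
```

For example, a round-trip time of 200 ns corresponds to a range of roughly 30 m.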

[0073] The scanning movement of the laser beam 10 is carried out by rotating the support 8 relative to the base 7 about the vertical axis 4 and by means of a rotating body 11, which is rotatably mounted on the support 8 and rotates about the horizontal axis 6. By way of example, both the transmitted laser beam 10 and the returning parts of the laser beam are deflected by means of a reflecting surface integral with the rotating body 11 or applied to the rotating body 11. Alternatively, the transmitted laser radiation comes from the side facing away from the reflecting surface, i.e. from the inside of the rotating body 11, and is emitted into the environment via a passage area within the reflecting surface (see below).

[0074] For the determination of the emission direction of the distance measuring beam 10 many different angle determining units are known in the prior art. For example, the emission direction may be detected by means of angle encoders, which are configured for the acquisition of angular data for the detection of absolute angular positions and/or relative angular changes of the support 8 or of the rotating body 11, respectively. Another possibility is to determine the angular positions of the support 8 or the rotating body 11, respectively, by only detecting full revolutions and using knowledge of the set rotation frequency.
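The second possibility, deriving the angular position from detected full revolutions and knowledge of the set rotation frequency, amounts to a simple phase interpolation. A minimal sketch, assuming the rotation speed is held constant between index events:

```python
import math

def angle_from_revolutions(time_since_index_s, set_frequency_hz):
    """Angular position of the support or the rotating body without an
    angle encoder: only one index event per full revolution is
    detected, and the angle within the current revolution is
    interpolated from the set rotation frequency."""
    phase = (time_since_index_s * set_frequency_hz) % 1.0
    return 2.0 * math.pi * phase
```

For a fast axis spinning at 50 Hz, 5 ms after the index event the rotating body has completed a quarter revolution.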

[0075] A visualization of the data can be based on commonly known data processing steps and/or display options, e.g. wherein the acquired data is presented in the form of a 3D point cloud or wherein a 3D vector model file is generated.

[0076] The laser scanner is configured to ensure a total field of view of the measuring operation of the laser scanner of 360 degrees in an azimuth direction defined by the rotation of the support 8 about the vertical axis 4 and at least 130 degrees in a declination direction defined by the rotation of the rotating body 11 about the horizontal axis 6. In other words, regardless of the azimuth angle of the support 8 about the vertical axis 4, the laser beam 10 can cover a vertical field of view 5 spread in the declination direction with a spread angle of at least 130 degrees.
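Membership in this total field of view can be sketched as a pure declination test relative to the central reference point 12, since the azimuth is unrestricted; placing the 130-degree band symmetrically about the horizontal plane is an assumption made for illustration (the actual band placement depends on the scanner design):

```python
import math

def in_field_of_view(direction, vertical_fov_deg=130.0):
    """Whether a direction (x, y, z) from the central reference point
    lies inside the measuring field: 360 degrees in azimuth, so only
    the elevation/declination angle matters."""
    x, y, z = direction
    elevation = math.degrees(math.atan2(z, math.hypot(x, y)))
    return abs(elevation) <= vertical_fov_deg / 2.0
```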

[0077] By way of example, the total field of view typically refers to a central reference point 12 of the laser scanner defined by the intersection of the vertical axis 4 with the horizontal axis 6.

[0078] FIG. 4 exemplarily shows different communication and data processing schemes implemented by the different embodiments of the system.

[0079] Processing can take place on an on-board computing unit, e.g. a dedicated computing unit 13 specifically mounted for that purpose on the autonomous robot 1 or a computing unit provided by the robot 1 itself. Processing may also be executed by means of cloud computing 14 and on a companion device 15, e.g. a tablet.

[0080] For example, as depicted by the two schemes on the left part of the figure, a dedicated on-board computing unit 13 extends local computing capabilities, while at the same time the dedicated on-board computing unit 13 can be connected to a local operator companion device 15 for areas where the system has no connectivity (top left of the figure), or can serve as a cloud interface to the data cloud 14 in order to enable cloud computing (bottom left of the figure). Alternatively, the lidar device 3 is configured to carry out at least part of the processing, e.g. to calculate the trajectory, and to locally communicate with the companion device 15, which serves as cloud interface and/or carries out further processing steps (top right of the figure). The lidar device 3 may also be directly linked to the cloud 14 (bottom right of the figure), wherein processing is distributed dynamically by the cloud 14.

[0081] Switching between on-board computing, cloud processing, processing by the lidar device, and processing by the companion device is carried out dynamically as a function of connectivity between the computing locations and available power on the mobile robot 1. Typically, whenever possible, processing is offloaded from the mobile robot, e.g. to the cloud and/or the companion device, because battery power and data storage of the mobile robot 1 and the devices located on the robot are limited.
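Such a workload selection policy can be sketched as follows; the thresholds, the task attributes, and the offload-first rule are illustrative assumptions rather than the actual switching logic:

```python
def assign_processing(tasks, bandwidth_mbps, battery_pct,
                      min_bandwidth_mbps=5.0, min_battery_pct=20.0):
    """Dynamic workload split (illustrative policy): offload every task
    that can be offloaded while connectivity allows, fall back to
    on-board processing when the link is too weak, and defer
    non-critical tasks when on-board battery is low.

    tasks: list of (name, offloadable: bool, critical: bool).
    Returns {"cloud": [...], "onboard": [...], "deferred": [...]}.
    """
    out = {"cloud": [], "onboard": [], "deferred": []}
    link_ok = bandwidth_mbps >= min_bandwidth_mbps
    for name, offloadable, critical in tasks:
        if offloadable and link_ok:
            out["cloud"].append(name)     # spare robot battery/storage
        elif critical or battery_pct >= min_battery_pct:
            out["onboard"].append(name)
        else:
            out["deferred"].append(name)
    return out
```

With good connectivity, heavy post-processing would land in the cloud while time-critical SLAM stays on board; with no link and a low battery, only critical tasks keep running locally.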

[0082] FIG. 5 schematically depicts a path planning software making use of a reference object 16. Only one reference object 16 is required for the principle to work. However, multiple reference objects 16 allow for a more accurate localization.

[0083] A reference object 16 is virtually introduced in a planning software 17 comprising a digital model 18 of the environment, e.g. a CAD model. A physical counterpart 19 to that virtual reference object 16, e.g. in the form of a matrix barcode, is generated and placed in the real environment. In the planning software a further path 20 within the digital model 18 is associated with the virtual reference object 16 such that control data for the robot 1 can be derived therefrom, which allow the robot 1 to be localized in the real environment and instructed to follow the further path 20 in the real environment. Thus, a planned path can serve as input to the robot control software 21.

[0084] For example, upon visual detection of the real reference object 19, here in the form of a matrix barcode, the path planning unit associates the detected reference object 19 with the predefined further path 20 and uses predefined trajectory information as input for the evaluation of the further trajectory, e.g. by performing a frame transformation between the real world and the “planned world”.
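The frame transformation between the real world and the "planned world" can be sketched in two dimensions: the reference object's pose is known in the digital model and measured in the real environment, and the rigid transform relating the two poses maps the planned path into real-world coordinates. Function names and the planar (x, y, heading) pose representation are assumptions for this illustration.

```python
import numpy as np

def frame_transform(marker_pose_model, marker_pose_real):
    """Each pose is (x, y, heading). Returns (R, t) such that
    p_real = R @ p_model + t for 2D points."""
    xm, ym, am = marker_pose_model
    xr, yr, ar = marker_pose_real
    dth = ar - am                              # heading difference
    R = np.array([[np.cos(dth), -np.sin(dth)],
                  [np.sin(dth),  np.cos(dth)]])
    t = np.array([xr, yr]) - R @ np.array([xm, ym])
    return R, t

def transform_path(path_model, R, t):
    """Map planned waypoints (N x 2) from the model frame to the real frame."""
    return (R @ np.asarray(path_model, dtype=float).T).T + t
```

A full implementation would use the 6DoF marker pose and a 3D rigid transform, but the principle, anchoring the planned path to the detected reference object, is the same.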

[0085] FIG. 6 schematically depicts an exemplary use of fiducial markers. By way of example, the robot is provided with a known list of fiducial markers 22 with corresponding actions. If a fiducial marker 22 is detected, the corresponding action is triggered. For example, an operator loads a particular fiducial marker 220 in the form of a matrix barcode on a smartphone 23, wherein the particular fiducial marker 220 is associated with stopping or pausing the system for a battery change. The operator presents the marker 220 to a camera mounted on the robot 1, wherein the system has access to a data cloud 14, which provides association of different actions 24 to each of the list of fiducial markers 22, which includes the particular fiducial marker 220. The operator is thus able to control the system at least to some degree, without actually having access to the control of the robot 1. By way of example, this may be useful for emergency situations, in that a non-operator is allowed to interact with the robot, e.g. to prevent actions, damage, etc.

[0086] Alternatively, or in addition, another particular fiducial marker 221 is fixedly placed within the environment, e.g. to make sure the robot 1 does not enter a particular area. Further markers (not shown) may be used as encoded survey control points (combined targets). Other markers may provide time-gated rules and actions such as “do not enter between 10 am-11 am”.
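The association of actions to the known list of fiducial markers, including time-gated rules such as "do not enter between 10 am-11 am", can be sketched as a simple lookup. The marker identifiers, action names, and table layout here are hypothetical placeholders, not values from the embodiment.

```python
from datetime import time

# Hypothetical marker-to-action table; time-gated entries carry an active window.
MARKER_ACTIONS = {
    "M220": {"action": "pause_for_battery_change"},
    "M221": {"action": "do_not_enter"},
    "M222": {"action": "do_not_enter",
             "active_from": time(10, 0), "active_until": time(11, 0)},
}

def action_for_marker(marker_id, now):
    """Look up the action for a detected marker; a time-gated rule only
    triggers inside its active window."""
    entry = MARKER_ACTIONS.get(marker_id)
    if entry is None:
        return None   # unknown marker: no action triggered
    start, end = entry.get("active_from"), entry.get("active_until")
    if start is not None and not (start <= now <= end):
        return None   # outside the time gate, rule is inactive
    return entry["action"]
```

In the described system such a table would be served by the data cloud 14, so that the set of recognized markers and their actions can be updated without touching the robot control software.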

[0087] FIG. 7 depicts an exemplary embodiment of a fiducial marker 40. On the left, the fiducial marker 40 is shown in a frontal view. On the right, the fiducial marker 40 is shown in an angled view.

[0088] The fiducial marker comprises a visually detectable pattern, e.g. provided by areas of different reflectivity, different gray scales and/or different colors. The pattern comprises a circular feature 41 and inner geometric features 42, which are enclosed by the circular feature 41.

[0089] By way of example, the system is configured to determine the 6DoF (six degrees of freedom) pose of the fiducial marker. The 6DoF pose is derived by determining a 3D orientation of the pattern, i.e. a 3D orientation of a pattern plane, and by determining a 3D position of the pattern. For example, marker corners 43 (at least three) are analyzed to provide for determination of an angle of the pattern plane. The marker corners 43 may be determined using a camera on the UGV or the UAV, respectively.

[0090] The circular feature 41 provides for improved determination of the 3D orientation of the pattern plane. By way of example, the system is configured to generate an intensity image of the pattern by scanning the pattern with a lidar measurement beam of the lidar device, wherein the intensity image is generated by detection of an intensity of a returning lidar measurement beam. By identifying the image of the circular feature within the intensity image of the pattern and fitting an ellipse to the image of the circular feature, the 3D orientation of the pattern plane is determined with improved precision. In addition, the center of the ellipse may be determined and used as an aiming point for the lidar device to determine the 3D position of the pattern, thereby allowing the 6DoF pose of the pattern to be determined.
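The geometric idea behind the ellipse fit can be sketched as follows: under an (assumed) orthographic projection, a circle tilted by an angle theta relative to the viewing direction images as an ellipse whose minor-to-major axis ratio equals cos(theta). The sketch below estimates the ellipse axes from sampled contour points via a covariance (PCA) fit; function names and the projection model are simplifying assumptions, not the embodiment's algorithm.

```python
import numpy as np

def fit_ellipse_axes(points):
    """Estimate semi-axes and center of an ellipse from points sampled on
    its contour, via the eigenvalues of the point covariance (PCA)."""
    pts = np.asarray(points, dtype=float)
    center = pts.mean(axis=0)
    cov = np.cov((pts - center).T)
    eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
    # For points uniformly sampled on an ellipse, each semi-axis equals
    # sqrt(2 * eigenvalue) of the sample covariance along that axis.
    major, minor = np.sqrt(2.0 * eigvals)
    return major, minor, center

def tilt_from_ellipse(major, minor):
    """Tilt of the pattern plane relative to the viewing direction."""
    return np.arccos(np.clip(minor / major, -1.0, 1.0))
```

Note that the axis ratio alone leaves a sign ambiguity in the tilt direction; the inner geometric features 42 (or the marker corners 43) can resolve it.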

[0091] The 3D orientation of the pattern, particularly the 6DoF pose of the pattern, is then taken into account for determining the local trajectory direction and/or in the evaluation of the further trajectory. For example, the 6DoF pose is taken into account to provide improved referencing of the lidar data with respect to the common coordinate system.

[0092] Although aspects are illustrated above, partly with reference to some preferred embodiments, it must be understood that numerous modifications and combinations of different features of the embodiments can be made. All of these modifications lie within the scope of the appended claims.