REALITY CAPTURE WITH A LASER SCANNER AND A CAMERA

20220373685 · 2022-11-24

Abstract

A reality capture device for generating a digital three-dimensional representation of an environment enables objects within an infrastructure to be surveyed or detected. The reality capture device is compact and easy to use, allowing for fast and reliable capture. It can be carried and moved by a mobile carrier, particularly a person, robot or vehicle, and can be moved during a measuring process for generating a digital representation of an environment. The mobile reality capture device includes a localization unit for providing a simultaneous localization and mapping functionality, a laser scanner, and a camera unit, and is configured to be carried by a user through a room. The room is surveyed during the movement of the mobile reality capture device, wherein the data from the laser scanner and the camera unit are referenced to each other by means of the localization unit.

Claims

1-232. (canceled)

233. Mobile reality capture device configured to be carried and moved by a mobile carrier, particularly a person or a robot or a vehicle, and to be moved during a measuring process for generating a digital representation of an environment, comprising: a localization unit, particularly comprising an inertial measurement unit (IMU), the localization unit being configured for generating localization data for determining a trajectory of the mobile reality capture device, a laser scanner configured to carry out, during movement of the mobile reality capture device, a scanning movement of a laser measurement beam relative to two rotation axes and, based thereon, to generate light detection and ranging (LIDAR) data for generating a three-dimensional point cloud, a base supporting the laser scanner, and a cover, particularly a cover which is opaque for visible light, mounted on the base such that the cover and the base encase all moving parts of the laser scanner, such that no moving parts can be touched from the outside.

234. Mobile reality capture device according to claim 233, wherein: the localization unit has an inertial measurement unit (IMU) for generating inertial data for the mobile reality capture device, the IMU comprising two inertial sensors, one of the inertial sensors is mounted on a part of the laser scanner, which rotates during the measuring process, the other one of the inertial sensors is mounted on a part of the laser scanner which is static relative to the base during the measuring process, and the localization unit is configured to determine a drift in the inertial data for the mobile reality capture device by comparing data of the two inertial sensors, taking into account a rotation parameter describing the relative rotation between the two inertial sensors.
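By way of illustration, the drift determination of claim 234 can be sketched as follows, assuming, purely hypothetically, that both gyroscope z-axes are aligned with the rotation axis and that the relative rotation rate is known; the function name and the single-axis simplification are illustrative and not part of the claim:

```python
import numpy as np

def gyro_bias_of_rotating_sensor(static_z, rotating_z, rel_rate_hz):
    """Estimate the gyro bias (drift) of the co-rotating inertial sensor by
    comparison with the static sensor (hypothetical 1-axis simplification)."""
    # Known relative rotation between the two sensors, in rad/s
    # (the "rotation parameter" of the claim).
    rel_omega = 2.0 * np.pi * rel_rate_hz
    # After subtracting the known relative rotation, both sensors should
    # agree; the mean residual is attributed to bias of the rotating sensor.
    return float(np.mean(np.asarray(rotating_z) - np.asarray(static_z) - rel_omega))
```
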

235. Mobile reality capture device according to claim 233, wherein the localization unit is configured such that the localization data are based on at least part of the LIDAR data, and the mobile reality capture device is configured for carrying out a LIDAR-based localization and mapping algorithm.

236. Mobile reality capture device according to claim 233, wherein the mobile reality capture device comprises a panoramic camera unit arranged on a lateral surface of the mobile reality capture device, the lateral surface defining a standing axis of the mobile reality capture device, namely wherein the lateral surface is circumferentially arranged around the standing axis, and the panoramic camera unit is configured to provide image data which cover a visual field of at least 120° around the standing axis, particularly at least 180°, more particularly 360°.

237. Mobile reality capture device according to claim 233, wherein the mobile reality capture device comprises a localization camera for being used by the localization unit, particularly wherein the localization camera is part of a panoramic camera unit, wherein the localization unit is configured such that the localization data are based on image data generated by the localization camera.

238. Mobile reality capture device according to claim 237, wherein the mobile reality capture device comprises multiple localization cameras for being used by the localization unit, the multiple localization cameras being configured and arranged such that, for a nominal minimum operating range of the localization unit, each of the multiple localization cameras has a field-of-view overlap with at least another one of the multiple localization cameras.

239. Mobile reality capture device according to claim 233, wherein the mobile reality capture device comprises a color camera configured to capture color images, the mobile reality capture device is configured to provide point cloud data for generating a colored three-dimensional point cloud based on the LIDAR data and the color images.

240. Mobile reality capture device according to claim 233, wherein the laser scanner is configured such that, for generating the LIDAR data, the rotations about the two rotation axes are faster than 0.1 Hz, particularly faster than 1 Hz, and the LIDAR data are generated with a point acquisition rate of at least 300,000 points per second, particularly at least 500,000 points per second.

241. Mobile reality capture device according to claim 233, wherein the cover provides a field of view of the laser scanner which is larger than half of a unit sphere around the laser scanner.

242. Mobile reality capture device according to claim 241, wherein the cover has a hemispherical head part, which merges, in the direction of the base, into a cylindrical shell, wherein the laser scanner is configured such that the LIDAR data are generated based on an orientation of the laser measurement beam where it passes through the hemispherical head part and an orientation of the laser measurement beam where it passes through the cylindrical shell.

243. Mobile reality capture device according to claim 233, wherein the cover is made of a material comprising plastic, wherein the cover has an atomic layer deposition (ALD) coating on the outside and on the inside, and wherein the ALD coating on the outside and/or the inside is covered by a hard coating.

244. Mobile reality capture device according to claim 233, wherein the cover has an anti-reflection (AR) coating on the inside and/or on the outside, wherein the cover has, on the inside and/or on the outside, an area which is free of the AR coating, and wherein the AR coating is applied on an inside circumferential band, which covers a limited elevation range.

245. Mobile reality capture device according to claim 233, wherein the cover has a hemispherical head part, the hemispherical head part comprises a planar area with a planar surface both on the outside and the inside, wherein the planar area is arranged at zenith.

246. Mobile reality capture device according to claim 245, wherein the planar area is specifically provided for mounting an additional sensor, particularly a global navigation satellite system (GNSS) transceiver, or wherein the planar area is specifically provided for a zenith LIDAR measurement by the laser scanner.

247. Mobile reality capture device according to claim 233, wherein the localization unit is configured to determine the trajectory with six degrees of freedom, namely involving position and orientation of the mobile reality capture device, the mobile reality capture device is configured for simultaneous localization and mapping (SLAM) to generate a three-dimensional map by involving data of the IMU, image data of the camera unit for visual simultaneous localization and mapping (VSLAM), and LIDAR data for LIDAR based simultaneous localization and mapping (LIDAR-SLAM).

248. Mobile reality capture device according to claim 233, wherein the laser scanner comprises: a support, mounted on the base and being rotatable relative to the base, and a rotating body for deflecting the outgoing laser measurement beam and returning parts of the laser measurement beam, the rotating body being mounted on the support and being rotatable relative to the support, wherein the generation of the LIDAR data comprises: a continuous rotation of the support relative to the base and a continuous rotation of the rotating body relative to the support, and emission of the laser measurement beam via the rotating body, which continuously rotates, and detection of parts of the laser measurement beam returning via the rotating body.

249. Mobile reality capture device according to claim 248, wherein the laser scanner is configured such that the continuous rotation of the rotating body relative to the support is faster than the continuous rotation of the support relative to the base, wherein the continuous rotation of the support is at least 0.1 Hz and the continuous rotation of the rotating body is at least 50 Hz.

250. Mobile reality capture device configured to be carried and moved by a mobile carrier, particularly a person or a robot or a vehicle, and to be moved during a measuring process for generating a digital representation of an environment, comprising: a localization unit, particularly comprising an inertial measurement unit (IMU), the localization unit being configured for generating localization data for determining a trajectory of the mobile reality capture device, a laser scanner configured to carry out, during movement of the mobile reality capture device, a scanning movement of a laser measurement beam relative to two rotation axes and, based thereon, to generate light detection and ranging (LIDAR) data for generating a three-dimensional point cloud, and a camera unit arranged on a lateral surface of the mobile reality capture device, the lateral surface defining a standing axis of the mobile reality capture device, wherein the lateral surface is circumferentially arranged around the standing axis, wherein the camera unit is configured to provide image data which cover a visual field of more than 180° around the standing axis, particularly 360°.

251. Mobile reality capture device according to claim 250, wherein the laser scanner is configured such that, for generating the LIDAR data, the rotations about the two rotation axes are faster than 0.1 Hz, particularly faster than 1 Hz, wherein the LIDAR data are generated with a point acquisition rate of at least 300,000 points per second, particularly at least 500,000 points per second.

252. Mobile reality capture device according to claim 251, wherein the laser scanner comprises: a support, mounted on the base and being rotatable relative to the base, and a rotating body for deflecting the outgoing laser measurement beam and returning parts of the laser measurement beam, the rotating body being mounted on the support and being rotatable relative to the support, wherein the generation of the LIDAR data comprises: a continuous rotation of the support relative to the base and a continuous rotation of the rotating body relative to the support, and emission of the laser measurement beam via the rotating body, which continuously rotates, and detection of parts of the laser measurement beam returning via the rotating body.

253. Mobile reality capture device according to claim 252, wherein the laser scanner is configured such that the continuous rotation of the rotating body relative to the support is faster than the continuous rotation of the support relative to the base, wherein the continuous rotation of the support is at least 0.1 Hz and the continuous rotation of the rotating body is at least 50 Hz.

254. Mobile reality capture device according to claim 253, wherein the camera unit comprises multiple cameras circumferentially arranged on the lateral surface and the mobile reality capture device is configured to generate from the image data a panoramic image, namely wherein individual images of the multiple cameras are stitched together to form an image having a wider field of view than the individual images.

255. Mobile reality capture device according to claim 253, wherein the mobile reality capture device is configured for simultaneous localization and mapping (SLAM) to generate a three-dimensional map of the environment by involving: data of the IMU (IMU-SLAM), image data of the camera unit for visual simultaneous localization and mapping (VSLAM), and LIDAR data for LIDAR based simultaneous localization and mapping (LIDAR-SLAM).

256. Mobile reality capture device according to claim 253, wherein the mobile reality capture device is configured to generate a three-dimensional vector file model of the environment, particularly a mesh.

257. Mobile reality capture device according to claim 253, wherein the mobile reality capture device comprises an attachment unit for attaching an accessory device to the mobile reality capture device, wherein the attachment unit has: a fixing unit with a receptacle, configured to receive a counterpart to the receptacle and to secure the counterpart in the receptacle, particularly in a way which maintains a, particularly predetermined, orientation of the counterpart relative to the receptacle, and a wireless data bus, configured to provide for uni-directional or bi-directional data transfer between the accessory device and the mobile reality capture device.

258. Mobile reality capture device according to claim 257, wherein the fixing unit comprises at least one of a magnet, a part of a hook and loop fastener, a female or male part of a plug-in connection, and a clamp.

259. Mobile reality capture device according to claim 257, wherein the attachment unit has an inductive power exchange unit, configured to provide power supply from the mobile reality capture device to an accessory device, which is secured by the fixing unit, and/or from the secured accessory device to the mobile reality capture device.

260. Mobile reality capture device according to claim 257, wherein the mobile reality capture device comprises a sensing unit, configured to detect an accessory device within reach for wireless data transfer by the wireless data bus, and to activate the wireless data bus for starting the data transfer upon detection of the accessory device within reach, and/or to detect that an accessory device is secured by the fixing unit, and to activate the inductive power exchange unit for starting the power exchange upon detection of the secured accessory device.

Description

BRIEF DESCRIPTION OF THE DRAWINGS

[0268] The aspects of the invention are described or explained in more detail below, purely by way of example, with reference to working examples shown schematically in the drawings. Identical elements are labelled with the same reference numerals in the figures. The described embodiments are generally not shown true to scale and are not to be interpreted as limiting the invention. Specifically,

[0269] FIG. 1: an exemplary application of the mobile reality capture device in building surveying;

[0270] FIG. 2: an exemplary embodiment of a mobile reality capture device according to the invention;

[0271] FIG. 3: an exemplary embodiment of a laser scanner to be used within the mobile reality capture device of FIG. 2;

[0272] FIG. 4: an exemplary embodiment of a rotating body, configured such that the outgoing laser measurement beam coming from the inside of the rotating body is sent into the environment through a passage area;

[0273] FIG. 5: a cross section through the rotating body of FIG. 4 without a prism;

[0274] FIG. 6: a cross section through the rotating body of FIG. 4 with the prism inserted;

[0275] FIG. 7: a cross section through a rotating body configured to hold a round cylinder prism;

[0276] FIG. 8: an exemplary shape of a cover for the laser scanner of FIG. 3;

[0277] FIG. 9: an exemplary embodiment of multiple light indicators, wherein each of the light indicators is assigned to a scan section fixed relative to the mobile reality capture device;

[0278] FIG. 10: detection of areas where additional data are required, e.g. to provide guidance to the detected areas by means of the light indicators depicted in FIG. 9;

[0279] FIG. 11: an exemplary embodiment of a laser scanner comprising a referencing element having a curved surface;

[0280] FIG. 12: an exemplary embodiment of a cooling system having a first area, which is free of rotating parts, and a second area comprising rotating parts for a scanning movement of a laser measurement beam, wherein the cooling system has an air entrance to let external air into the first area;

[0281] FIG. 13: control of data storage taking into account an evaluation of a geometric relationship between an acquisition position and an area to be probed from the acquisition position;

[0282] FIG. 14: re-initialization of a SLAM unit by recalling a relative positional relationship between SLAM features and positions of the mobile reality capture device along the previous trajectory;

[0283] FIG. 15: a system comprising a mobile reality capture device and a companion device, which are configured to establish a server-client communication;

[0284] FIG. 16: an exemplary application of a reality capture device as a monitoring device, here for monitoring a subway station;

[0285] FIG. 17: an exemplary embodiment of a monitoring device according to the invention;

[0286] FIG. 18: an exemplary embodiment of a receptacle for attaching the mobile reality capture device to an additional component;

[0287] FIG. 19: determination of the signal strength of a radio signal available along the trajectory of the mobile reality capture device, for determining a heat map indicating a classification of the environment into different radio signal reception areas.

DETAILED DESCRIPTION

[0288] FIG. 1 shows an exemplary application of the mobile reality capture device 1 in the field of architecture or real estate, e.g. wherein an architect or a potential homebuyer would like to have a 3D model of a room or the entire building for providing improved visualization of details or potential extension plans.

[0289] The mobile reality capture device 1 comprises a localization unit, e.g. for providing a simultaneous localization and mapping functionality, a laser scanner, and a camera unit, wherein the mobile reality capture device is configured to be carried by a user through the room. The room is surveyed during the movement of the mobile reality capture device, wherein the data from the laser scanner and the camera unit are referenced to each other by means of the localization unit, e.g. within the scope of a SLAM functionality.
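By way of illustration, the mutual referencing of laser scanner and camera data via the localization unit can be sketched as a pose lookup on a common trajectory; the one-dimensional position and the sample values below are purely hypothetical, a real device would use full 6-DoF poses:

```python
import numpy as np

# Hypothetical trajectory samples from the localization unit (time, position).
pose_times = np.array([0.0, 1.0, 2.0])
pose_x = np.array([0.0, 0.5, 1.2])

def pose_at(t):
    """Interpolate the device pose at an arbitrary sensor timestamp."""
    return float(np.interp(t, pose_times, pose_x))

# A LIDAR point and a camera frame taken at different instants are referenced
# to each other by expressing both in the frame given by the trajectory.
lidar_pose = pose_at(0.5)
camera_pose = pose_at(1.5)
```
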

[0290] Thanks to the movement of the user, objects and spatial areas can be measured from different angles, as a result of which shadowing and/or dead angles can be avoided.

[0291] The laser scanner is configured to scan the room by means of a laser measurement beam 2, which is moved relative to two rotation axes, e.g. a vertical axis of rotation 3, often referred to as the “slow” axis or azimuth axis, and a rotation axis 4 perpendicular thereto, often also referred to as the “fast” axis or elevation axis.

[0292] By way of example, a desired point-to-point resolution is adjustable by adjusting the pulse rate of the laser measurement beam 2 and/or by adjusting the rotational speed about the two rotation axes, wherein often the rotation about the elevation axis 4 is set higher than the rotation about the azimuthal axis 3.
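By way of illustration, the resulting point spacing can be estimated from the pulse rate and the two rotation speeds; the function name and the example values below are assumed for illustration only:

```python
def point_spacing_deg(pulse_rate_hz, fast_axis_hz, slow_axis_hz):
    """Angular point spacing implied by pulse rate and rotation speeds."""
    # Spacing between consecutive points within one scan profile (fast axis).
    along_profile = 360.0 * fast_axis_hz / pulse_rate_hz
    # Azimuthal spacing between consecutive scan profiles (slow axis).
    between_profiles = 360.0 * slow_axis_hz / fast_axis_hz
    return along_profile, between_profiles

# e.g. 500,000 points/s, fast axis at 50 Hz, slow axis at 0.5 Hz
along, between = point_spacing_deg(500_000.0, 50.0, 0.5)
```

Increasing the pulse rate tightens the spacing along a profile, while slowing the azimuthal rotation tightens the spacing between profiles.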

[0293] The cameras of a camera unit (see below) are arranged on a lateral surface of the mobile reality capture device, the lateral surface defining a standing axis 5 of the mobile reality capture device, wherein the lateral surface is circumferentially arranged around the standing axis. For example, the camera unit is configured to provide image data which, essentially instantaneously, cover a visual field of more than 180° around the standing axis 5, particularly 360°.
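By way of illustration, the number of circumferentially arranged cameras needed for full coverage around the standing axis follows from the per-camera horizontal field of view and the desired mutual overlap; the values below are assumed examples:

```python
import math

def cameras_for_full_circle(per_camera_hfov_deg, min_overlap_deg):
    """Minimum camera count for 360° coverage around the standing axis."""
    # Each camera contributes its field of view minus the required overlap
    # with its neighbour as unique angular coverage.
    unique_coverage = per_camera_hfov_deg - min_overlap_deg
    return math.ceil(360.0 / unique_coverage)

# e.g. cameras with a 100° horizontal field of view and 10° mutual overlap
n = cameras_for_full_circle(100.0, 10.0)
```
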

[0294] By way of example, the camera unit comprises one or multiple cameras intended for use in a visual SLAM functionality; one or multiple color cameras, e.g. for colorizing a three-dimensional point cloud; one or multiple high-resolution cameras, e.g. for providing a high-resolution detail image; one or multiple high dynamic range (HDR) cameras, e.g. single-exposure HDR cameras; one or multiple multispectral, particularly hyperspectral, cameras, e.g. for identification of surface properties or for differentiating different kinds of surfaces; and one or multiple thermal cameras, e.g. for providing temperature information.

[0295] The mobile reality capture device 1 may further include other sensors or have additional auxiliary device interfaces, e.g. an interface for attaching a GNSS rover or a display.

[0296] In particular, the mobile reality capture device 1 is configured to communicate with an external processing unit of a companion device, e.g. a computer, tablet or smartphone, which is configured to process at least parts of the measurement data of the reality capture device 1, e.g. for referencing the camera data with the laser scanner data or for providing extended display functionality.

[0297] In particular, the reality capture device 1 is configured to transmit measurement data to the external processing unit by means of data streaming that starts simultaneously with, or at least close in time to, the measurement process, e.g. via a WLAN or Bluetooth connection, so that the processing of the measurement data on the external processing unit can take place essentially parallel to the data acquisition. For example, this way the measurement data can be displayed continuously for a user as a continuously growing colored 3D point cloud, e.g. by means of a display coupled to the mobile reality capture device 1.

[0298] By way of example, the localization unit is configured to determine a trajectory of the mobile reality capture device 1 with six degrees of freedom, i.e. involving position and orientation (pose) of the mobile reality capture device. In particular, the mobile reality capture device 1 may be configured for simultaneous localization and mapping (SLAM) to generate a three-dimensional map by involving at least one of data of an inertial measurement unit (IMU-SLAM), image data of the camera unit for visual SLAM (VSLAM), and light detection and ranging (LIDAR) data of the laser scanner for LIDAR based SLAM mapping (LIDAR-SLAM).
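By way of illustration, a pose with six degrees of freedom, i.e. position plus orientation, can be represented as a rotation matrix and a translation vector; the minimal class below is an assumed sketch, not the device's actual data structure:

```python
import numpy as np

class Pose:
    """Minimal 6-DoF pose: 3x3 rotation matrix R and 3-vector translation t."""
    def __init__(self, R, t):
        self.R = np.asarray(R, dtype=float)
        self.t = np.asarray(t, dtype=float)

    def transform(self, p_device):
        """Map a point from the device frame into the world (map) frame."""
        return self.R @ np.asarray(p_device, dtype=float) + self.t

# e.g. a device yawed 90 degrees about the vertical axis, standing at (1, 0, 0)
yaw90 = [[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]]
world_point = Pose(yaw90, [1.0, 0.0, 0.0]).transform([1.0, 0.0, 0.0])
```
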

[0299] In addition to the localization unit, the reality capture device 1 may further be provided with a positioning unit such as a global navigation satellite system (GNSS) transceiver or a compass, e.g. for referencing the data of the mobile reality capture device with a global coordinate system.

[0300] FIG. 2 shows an exemplary embodiment of a mobile reality capture device, comprising a laser scanner 6 and a camera unit with a plurality of cameras 7.

[0301] The laser scanner 6 has a cover 8, which is opaque for visible light but optically transmissive for the laser measurement beam. Together with a base of the laser scanner the cover 8 forms a closed housing which is stationary with respect to the mobile reality capture device, wherein all moving parts of the laser scanner 6 are enclosed by the housing.

[0302] By way of example, the mobile reality capture device is configured to require only a minimal number of controls integrated into the device. For example, the device has only a single integrated control element 9, which has an active and an inactive state and is switchable via an external action to assume the active or the inactive state.

[0303] For example, individual measurement programs and/or actions of the reality capture device can be triggered by at least one of: a change of the state of the control element 9 from the inactive to the active state, a change of the state of the control element 9 from the active to the inactive state, a switching of the control element 9 by means of a lasting external effect during a defined period of time (e.g. continuous pressing of a control button), an encoded sequence of state changes of the control element 9 between the active and inactive state, and a coded sequence of temporally lasting external effects on the control element 9 over defined periods of time. Such measurement programs or actions may include at least one of: activating/deactivating the laser scanner 6, starting a defined measuring process, or interrupting/canceling and restarting the measuring process.
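By way of illustration, such an encoded sequence of state changes can be decoded with a simple lookup; the command names and the short/long threshold below are assumed examples, not taken from the description above:

```python
# Assumed example mapping from press patterns to device actions.
COMMANDS = {
    ("short",): "start_measurement",
    ("short", "short"): "stop_measurement",
    ("long",): "power_off",
}

def decode(press_durations_s, long_threshold_s=1.0):
    """Classify each press as short/long and look up the resulting command."""
    events = tuple(
        "long" if d >= long_threshold_s else "short" for d in press_durations_s
    )
    return COMMANDS.get(events, "unknown")
```

A user-defined measurement program, as described below, would then correspond to adding a new pattern/command pair to the mapping.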

[0304] The mobile reality capture device can also be configured such that defined measurement programs and actions are stored on the device and/or that new measurement programs and actions can be defined by the user, e.g. via a corresponding input functionality for assigning commands to the states and/or state changes of the control element 9.

[0305] By way of example, the mobile reality capture device further comprises a light indicator 10, e.g. for indicating a device status in such a way that the status indication looks uniform in all azimuthal directions around the standing axis of the reality capture device. Furthermore, the light indicator 10 may be configured to provide guiding instructions (see below).

[0306] FIG. 3 shows a close-up of the laser scanner 6 from FIG. 2, comprising a base 11 and a support 12, the support 12 being rotatably mounted on the base 11 about a support rotation axis 3. Often the rotation of the support 12 about the support rotation axis 3 is also called azimuthal rotation, regardless of whether the laser scanner, or the support rotation axis 3, is aligned exactly vertically.

[0307] The core of the laser scanner 6 is an optical distance measuring unit 13 arranged in the support 12 and configured to perform a distance measurement by emitting a transmission radiation 2, e.g. pulsed laser radiation, and by detecting returning parts of the transmission radiation by means of a receiving unit comprising a photosensitive sensor. Thus, a pulse echo is received from a backscattering surface point of the environment, wherein a distance to the surface point can be derived based on the time of flight, the shape, and/or the phase of the emitted pulse.
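By way of illustration, the time-of-flight variant of this distance measurement reduces to halving the round-trip distance travelled at the speed of light:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_time_s):
    """Distance to the backscattering surface point from the pulse's
    round-trip time; the factor 1/2 accounts for the out-and-back path."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# e.g. a pulse echo received 200 ns after emission
d = tof_distance_m(200e-9)
```
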

[0308] In the embodiment shown, the scanning movement of the laser measurement beam about the two rotation axes 3,4 is carried out by rotating the support 12 relative to the base 11 about the support rotation axis 3 and by means of a rotating body 14, which is rotatably mounted on the support 12 and rotates about a beam rotation axis 4.

[0309] By way of example, both the transmission radiation 2 and the returning parts of the transmission radiation are deflected by means of a reflecting surface 15 integral with the rotating body 14 or applied to the rotating body 14.

[0310] Alternatively, one aspect of the invention relates to the transmission radiation coming from the side facing away from the reflecting surface 15, i.e. coming from the inside of the rotating body 14, and being emitted into the environment via a passage area within the reflecting surface (see below).

[0311] For the determination of the emission direction of the distance measuring beam 2 many different angle determining units are known in the prior art. For example, the emission direction may be detected by means of angle encoders, which are configured for the acquisition of angular data for the detection of absolute angular positions and/or relative angular changes of the support 12 about the support rotation axis 3, or of the rotating body 14 about the beam rotation axis 4, respectively. Another possibility is to determine the angular positions of the support 12 or the rotating body, respectively, by only detecting full revolutions and using knowledge of the set rotation frequency.
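By way of illustration, the second approach, deriving the angular position from full-revolution detections and the set rotation frequency, can be sketched as follows (the function name is assumed):

```python
def beam_angle_deg(time_since_index_s, rotation_hz):
    """Angular position inferred from the time elapsed since the last
    full-revolution (index) detection, assuming the set rotation frequency
    is held accurately."""
    return (time_since_index_s * rotation_hz * 360.0) % 360.0
```

The accuracy of this scheme rests on the rotation speed being regulated tightly between index detections.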

[0312] A visualization of the data can be based on commonly known data processing steps and/or display options, e.g. wherein the acquired data are presented in the form of a 3D point cloud or wherein a 3D vector file model is generated.

[0313] FIG. 4 shows an exemplary embodiment of a rotating body 14, which is attached to the support 12 and configured for deflecting the transmission radiation 2, or parts of the transmission radiation returning from the environment, about the beam rotation axis.

[0314] The rotating body 14 has a passage area 16 for the transmission radiation 2 arranged in the reflecting surface. The transmission radiation 2 comes from the side facing away from the reflecting surface 15, i.e. from the inside of the rotating body 14.

[0315] The parts 17 of the transmission radiation coming back from the environment are deflected by the reflecting surface 15 towards an optical window and forwarded, e.g. by additional stationary deflecting means 18, to a receiver of the optical measuring unit.

[0316] For example, this so-called “backward injection of the transmission radiation” has the advantage of enabling a compact design of the laser scanner.

[0317] By way of example, the deflection of the transmission radiation 2 is effected by means of a prism 19 arranged in the rotating body 14 and co-rotating with the rotating body, wherein the prism deflects the transmission radiation away from the beam rotation axis, in particular—as in the embodiment shown—to a direction perpendicular to the beam rotation axis.

[0318] The rotating body 14 may particularly be configured such that the transmission radiation 2 is emitted by the laser source 20 into a free-beam section 21 within the rotating body 14, e.g. directly or by means of an optical fiber, particularly wherein further additional optics 22 can be arranged within the rotating body 14.

[0319] FIGS. 5 and 6 show a section of a cross section through the rotating body 14 of FIG. 4, comprising an exemplary embodiment of a receptacle 23 formed along the beam rotation axis 4, wherein the receptacle 23 is configured for receiving a prism 19 as deflection component for the transmission radiation 2. FIG. 5 shows the receptacle 23 without the prism 19 and FIG. 6 shows the receptacle 23 with the prism 19 installed.

[0320] By way of example, the receptacle 23 has a substantially cuboidal shape for receiving a cuboid prism 19. The cuboid edges define a first 24 and a second 25 stabilization plane. In the embodiment shown, the first stabilization plane 24 has a first 26 and a second 27 contact surface, and the second stabilization plane 25 has a third contact surface 28 and a redundant surface 29 (not used for stabilization of the prism), wherein a recess 30 is formed between the first 26 and second 27 contact surface and between the third contact surface 28 and the redundant surface 29. In this case, the first contact surface 26 adjoins the third contact surface 28, whereby a corner 31 is formed, here a 90 degree corner.

[0321] In particular, the prism 19 may have a facet which leaves the corner 31 free, wherein the prism is pressed by means of a fixing component 32, typically via an intermediate roll-over component 33, in the direction of the corner 31, so that the forces on the three contact surfaces 26, 27, 28 are essentially equal.

[0322] A particular aspect of the invention relates to the production of the rotating body 14 or the receptacle 23, respectively, in particular in the light of a compact design of the rotating body 14 and the laser scanner as a whole.

[0323] For example, the rotating body 14 is formed by means of a lathe as a solid of revolution, wherein subsequently the recesses 30 of the receptacle 23 are made, e.g. by means of drilling, planing or milling. In particular, the rotating body 14 so formed may have a receptacle 23 for a deflection component, e.g. a prism 19, wherein the receptacle 23, as shown in the figure, at the same time has a mirror axis, which is arranged coaxially to the beam rotation axis 4.

[0324] Alternatively, the rotating body 14 and the receptacle 23, particularly in case that large quantities have to be produced, may be formed by means of an injection molding process.

[0325] FIG. 7 shows an alternative to the use of a cuboid prism as shown by FIG. 5 and FIG. 6, wherein instead a round cylinder prism 19′ is used as deflection component for the transmission radiation within the rotating body.

[0326] Here, the receptacle has a rounded pressing region 31′ and a countering mechanism, e.g. two screw holes 34, for turning and fixing the received prism 19′, again by means of a fixing component 32 and an intermediate roll over component 33.

[0327] FIG. 8 shows an exemplary shape of a cover for the laser scanner from FIG. 3 according to a further aspect of the invention. The cover is rotationally symmetrical with respect to the support rotation axis 3 and can be mounted on a substructure (not shown), wherein together with the substructure the cover forms an enclosure, which is resting with respect to the base, of the support 12 and the rotating body 14 (FIG. 3, FIG. 4).

[0328] The cover has a substantially hemispherical head part 35, which merges in the direction of the substructure into a cylindrical shell 36.

[0329] The cover and substructure are configured to ensure a total field of view of the measuring operation of the laser scanner 6 of 360 degrees in an azimuth direction defined by the rotation of the support 12 about the support rotation axis 3 and at least 120 degrees in a declination direction defined by the rotation of the rotating body 14 about the beam rotation axis 4. In other words, the cover and the substructure are configured such that, regardless of the azimuth angle of the support 12 about the support rotation axis 3, the transmission radiation 2 can cover a vertical field of view 38 spread in the declination direction with a spread angle of at least 120 degrees.
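
The field-of-view condition described above may, purely as an illustrative sketch, be expressed as a simple geometric check. The assumption that the vertical field of view spans declinations of 0 to 120 degrees measured from the support rotation axis is made for this example only:

```python
import math

def in_field_of_view(direction, vfov_deg=120.0):
    """Check whether a direction vector (x, y, z), given in the scanner
    frame with z along the support rotation axis, lies within the total
    field of view: 360 degrees in azimuth and, as an assumption of this
    sketch, declinations 0..vfov_deg measured from the support rotation
    axis."""
    x, y, z = direction
    norm = math.sqrt(x * x + y * y + z * z)
    declination = math.degrees(math.acos(z / norm))  # 0 deg = along the support rotation axis
    # Azimuth is unrestricted (full 360 degree coverage), so only the
    # declination needs to be checked.
    return declination <= vfov_deg

print(in_field_of_view((0.0, 0.0, 1.0)))   # along the axis: True
print(in_field_of_view((1.0, 0.0, 0.0)))   # horizontal (90 deg): True
print(in_field_of_view((0.0, 0.0, -1.0)))  # opposite the axis (180 deg): False
```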

[0330] By way of example, the total field of view typically refers to a central reference point 39 of the laser scanner 6 defined by the intersection of the support rotation axis 3 with the beam rotation axis 4.

[0331] In the context of the measuring process, distance measurements thus take place both based on transmission radiation 2 passing through the head part 35 and on transmission radiation 2 passing through the cylindrical shell 36.

[0332] The cross section of the boundary of the cover, in particular the thickness of the boundary as a function of the angle at which the transmission radiation 2 passes through the cover, starting from the reference point 39, affects the beam shape of the transmission radiation. In particular, the curvature of the cover may lead to a defocusing of the transmission beam 2 and thus to an increase in the focal length of the laser scanner 6. The boundary is therefore optimized accordingly, and spline based correction parameters can be stored on the laser scanner to correct the beam deflection caused by the cover.
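
The stored correction may, as an illustrative sketch only, be applied as a lookup in a calibration table. The table values are hypothetical, and piecewise linear interpolation stands in for the spline based parameters mentioned above:

```python
import bisect

# Hypothetical calibration table: passage angle of the transmission
# radiation through the cover (deg) -> beam deflection to subtract
# (millidegrees). In the device, spline based parameters would be
# stored; this sketch interpolates linearly between support points.
ANGLES = [0.0, 30.0, 60.0, 90.0, 120.0]
DEFLECTION_MDEG = [0.0, 1.2, 2.8, 3.5, 5.1]

def correct_angle(raw_angle_deg):
    """Return the raw passage angle corrected for the beam deflection
    caused by the cover, interpolating the calibration table."""
    a = max(ANGLES[0], min(raw_angle_deg, ANGLES[-1]))  # clamp to table range
    i = min(bisect.bisect_right(ANGLES, a) - 1, len(ANGLES) - 2)
    t = (a - ANGLES[i]) / (ANGLES[i + 1] - ANGLES[i])
    deflection = DEFLECTION_MDEG[i] + t * (DEFLECTION_MDEG[i + 1] - DEFLECTION_MDEG[i])
    return raw_angle_deg - deflection / 1000.0  # millidegrees -> degrees

print(round(correct_angle(45.0), 4))  # 44.998
```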

[0333] For example, the boundary in the head part 35 has a substantially constant thickness, which is reduced in the cylindrical shell 36 in the direction of the substructure.

[0334] Furthermore, the cover may have, for example, a special optical coating, in particular an anti-reflex coating applied by atomic layer deposition (ALD) and/or a scratch-resistant coating.

[0335] FIG. 9 shows an exemplary embodiment of a mobile reality capture device 1′ having multiple light indicators 40, wherein each of the light indicators is assigned to a scan section fixed relative to the mobile reality capture device 1′.

[0336] The left part of the image shows a side view of the mobile reality capture device 1′ and the right part shows a top view of the mobile reality capture device, wherein in the top view only the multiple light indicators 40 are shown.

[0337] By way of example, the multiple light indicators 40 comprise six light indicators 40A, 40B, 40C, 40D, 40E, 40F, such that the arrangement of each of the light indicators on the lateral surface corresponds with its assigned scan section: a “forward” direction 41A (opposite the control element 9, FIG. 2), a “backward” direction 41B, a “forward left” direction 41C, a “forward right” direction 41D, a “backward left” direction 41E, and a “backward right” direction 41F.

[0338] For example, the light indicators may provide, e.g. in real time, an indication of a quality parameter for probing data acquired within the respective scan section, or the light indicators may provide guidance from a current location of the mobile reality capture device 1′ towards an area of the environment, e.g. an area wherein acquisition of additional data is required (see FIG. 10).
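
A possible mapping from a device-relative bearing to the nearest of the six indicators may be sketched as follows; the sector bearings assigned to the indicators 40A-40F are assumptions of this example:

```python
# Hypothetical device-relative bearings (deg) for the six light
# indicators 40A-40F; the actual sector layout is an assumption.
INDICATORS = {
    "40A forward": 0.0,
    "40C forward-left": 60.0,
    "40E backward-left": 120.0,
    "40B backward": 180.0,
    "40F backward-right": 240.0,
    "40D forward-right": 300.0,
}

def indicator_for_bearing(bearing_deg):
    """Pick the light indicator whose assigned scan section lies closest
    to a device-relative bearing (0 deg = forward, counterclockwise)."""
    def angular_distance(a, b):
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)
    return min(INDICATORS,
               key=lambda name: angular_distance(INDICATORS[name], bearing_deg % 360.0))

print(indicator_for_bearing(10.0))   # near forward -> 40A forward
print(indicator_for_bearing(-70.0))  # i.e. 290 deg -> 40D forward-right
```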

[0339] FIG. 10 schematically shows a scanning workflow using the mobile reality capture device 1′ of FIG. 9, wherein the mobile reality capture device is configured to carry out a data check, e.g. wherein the device is configured to automatically recognize the end of the measuring campaign based on detecting a movement towards an exit 42 of the room, upon which the data check is automatically triggered.

[0340] The figure shows a top view of a room to be measured, wherein the trajectory 43 of the mobile reality capture device is indicated.

[0341] The device has access to model data of the room, e.g. a three-dimensional map generated by a SLAM algorithm of the device or a pre-defined room model, e.g. from a building information model (BIM).

[0342] In a first area 44, the reality capture device recognizes that the distance to the wall of the room to be measured was outside the nominal distance range that provides optimal point resolution of the laser scanner. Thus, the device is configured to generate guiding data to guide the user towards the recognized area 44, which has insufficient data quality.

[0343] In a second area 45, the mobile reality capture device recognizes a room, which has been missed during the measurement, e.g. by taking into account the pre-defined room model. Also in this case, the device generates guiding data to guide the user to this missed room 45.

[0344] For example, guidance may be provided by means of the multiple light indicators 40A-F (FIG. 9).

[0345] FIG. 11 shows an exemplary embodiment of a laser scanner comprising a referencing element 46 having a curved surface.

[0346] In the embodiment shown, the referencing element 46 has a curved surface, which is arched outwards, i.e. in the direction of the rotating body 14, wherein the referencing element 46 is buried in a depression in a nadir area of the support 12. The lateral walls of the depression, which cross the scanning plane swept by the rotating laser measurement beam 2, are configured to act as a radiation swamp.

[0347] The referencing element 46 thus has the effect that the outgoing laser measurement beam 2 generates a track on the curved surface, wherein, depending on the track position, different incidence angles of the outgoing laser measurement beam 2 with the curved surface are generated and different fractions of light are scattered back along the incident direction of the laser measurement beam 2 (see the zoom-in view on the right of the figure, which shows the details from a side perspective relative to the full image).

[0348] FIG. 12 shows an exemplary embodiment of a cooling system of a mobile reality capture device, wherein the device has a first area 48, which is free of rotating parts, and a second area 49, which comprises rotating parts of the laser scanner to provide a scanning movement of the laser measurement beam.

[0349] The mobile reality capture device has a pumping device (not shown) for driving an airflow comprising external air, an air entrance 50 to let in the external air 51 into the first area 48, and an air passage 52 to forward air from the first area, i.e. the air that entered through the air entrance 50, into the second area 49. In the embodiment shown, the reality capture device is further configured to separately pass air from the first area into an area comprising cooling ribs 53.

[0350] Furthermore, the cooling system comprises a two-stage filtering system, with a first filter 54, which is at least a rainproof filter, and a second filter 55, which has a finer filter fineness than the first filter 54. The first filter 54 is arranged at the air entrance 50 and separates the internal space of the cooling system from the ambiance, wherein the second filter 55 is arranged in the internal space and separates the internal space into a dirty inner zone 56, lying upstream of the second filter 55 between the first filter 54 and the second filter 55, and a clean inner zone 57, lying downstream of the second filter between the second filter and an air outlet 58 for releasing air into the ambiance. For example, the air outlet 58 comprises the same kind of filter as the first filter to protect the internal space from contamination by refluxing air from the outside.

[0351] FIG. 13 schematically shows a scanning workflow, wherein redundant data are deleted by taking into account an evaluation of a geometric relationship between an acquisition position and an area to be probed from the acquisition position.

[0352] The figure shows a top view of a room to be measured, wherein the trajectory 43 of the mobile reality capture device is indicated.

[0353] Here, the mobile reality capture device has a data evaluation unit configured to carry out an evaluation of a geometric relationship between an acquisition position of the mobile reality capture device and an area to be probed. For example, the mobile reality capture device may have a SLAM unit, e.g. a visual SLAM unit, which provides a three-dimensional map of the environment. This allows the evaluation unit to determine the current location 59 of the mobile reality capture device within the three-dimensional map and to derive, for this point, geometric relations with respect to the surroundings.

[0354] By way of example, the evaluation unit notices that, compared to a previous location 60 of the mobile reality capture device on the trajectory 43, a current distance 61 to a previously captured area 62 now matches the focal distance of a camera unit of the mobile reality capture device better than the distance 63 corresponding to the previous location 60.

[0355] Therefore, image data corresponding to the previous acquisition can be automatically deleted without requiring an extensive on-device data analysis. Thus, redundant data can be deleted close to the sensor, which, for example, has benefits regarding required storage volume and data transfer rate to a companion device.
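
The deletion criterion may be sketched as follows; among several acquisitions of the same area, only the one whose acquisition distance best matches the focal distance of the camera unit is kept. The focal distance value is an assumption of this example:

```python
# Hypothetical focal distance of the camera unit in metres.
FOCAL_DISTANCE_M = 3.0

def select_redundant(acquisitions):
    """acquisitions: list of (acquisition_id, distance_to_area_m).
    Returns (best_id, ids_to_delete), where the best acquisition is the
    one whose distance lies closest to the focal distance."""
    best_id, _ = min(acquisitions, key=lambda a: abs(a[1] - FOCAL_DISTANCE_M))
    to_delete = [aid for aid, _ in acquisitions if aid != best_id]
    return best_id, to_delete

# Previous location 60 captured area 62 at 6.1 m; current location 59 at 3.2 m:
best, delete = select_redundant([("loc60", 6.1), ("loc59", 3.2)])
print(best, delete)  # loc59 ['loc60']
```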

[0356] FIG. 14 schematically shows a scanning workflow, wherein the mobile reality capture device is configured for re-initialization of a SLAM unit by recalling a relative positional relationship between SLAM features and a position of the mobile reality capture device along the previous trajectory.

[0357] The figure shows a top view of a room to be measured, wherein the trajectory 43 of the mobile reality capture device is indicated.

[0358] The mobile reality capture device has a SLAM unit configured to generate a 3D map, wherein the 3D map is generated by identifying different prominent features within the environment, e.g. corners or features with distinct reflection properties. For example, the SLAM unit may be based on a visual SLAM algorithm, wherein the SLAM unit matches prominent features present in continuously generated images to spatially link the scenes represented by these images to each other.

[0359] According to this aspect of the invention, the reality capture device further has a feature tracker. For different points along the trajectory, the feature tracker determines and stores a relative position of identified features 65, i.e. in a local coordinate system 64 associated with the current position. In case of a measurement interruption, e.g. due to a power failure, or in case the work from a previous measuring campaign is continued, the user then only has to re-establish a location close to the last position along the trajectory, wherein the mobile reality capture device is configured to re-initialize the SLAM unit for continuing the generation of the 3D map by recalling a series of relative feature positions corresponding to the most recent positions of the mobile reality capture device along the trajectory 43.
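
The re-initialization step may be sketched as follows; the matching tolerance and the two-dimensional feature positions are simplifying assumptions of this example:

```python
import math

def match_score(stored, observed, tol=0.2):
    """Count stored relative feature positions (x, y) that have an
    observed feature within tol metres; a simple stand-in for the
    feature matching performed by the SLAM unit."""
    hits = 0
    for sx, sy in stored:
        if any(math.hypot(sx - ox, sy - oy) <= tol for ox, oy in observed):
            hits += 1
    return hits

def reinitialize(trajectory_log, observed):
    """trajectory_log: list of (pose_id, relative_feature_positions)
    recorded along the previous trajectory. Returns the pose whose
    stored features best match the features observed after the
    interruption."""
    return max(trajectory_log, key=lambda entry: match_score(entry[1], observed))[0]

log = [
    ("pose1", [(1.0, 0.0), (0.0, 2.0)]),
    ("pose2", [(0.5, 0.5), (2.0, 1.0), (-1.0, 0.3)]),
]
print(reinitialize(log, [(0.55, 0.45), (2.05, 1.0)]))  # pose2
```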

[0360] FIG. 15 shows an exemplary system comprising a mobile reality capture device 1 and a companion device 66, e.g. a tablet or a smartphone.

[0361] In the embodiment shown, the companion device 66 is configured to act as a server of a server-client communication protocol, wherein the mobile reality capture device 1 is configured to act as client. The access data for the server-client communication, e.g. a service set identifier (SSID) and a password for the server, is encoded into a matrix barcode 67, e.g. a QR code, displayed on the companion device 66. The mobile reality capture device has a camera 7, which may take an image of the matrix barcode 67 upon manual trigger by a user, e.g. by pressing the control element 9. The mobile reality capture device 1 is then configured to automatically recognize the matrix barcode 67 in the image, to automatically decode the matrix barcode, and to automatically establish the server-client communication upon decoding the matrix barcode 67.
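
Decoding the matrix barcode itself requires an image-processing step; assuming the decoded payload follows the common “WIFI:” convention used by Wi-Fi sharing codes, the extraction of the access data may be sketched as follows (escaping of special characters is ignored in this example):

```python
def parse_wifi_qr(payload):
    """Parse the common 'WIFI:' QR payload into its fields, e.g.
    T (security type), S (SSID), P (password). This sketch assumes the
    simple 'key:value;' layout and ignores character escaping."""
    if not payload.startswith("WIFI:"):
        raise ValueError("not a Wi-Fi QR payload")
    fields = {}
    for part in payload[len("WIFI:"):].split(";"):
        if ":" in part:
            key, _, value = part.partition(":")
            fields[key] = value
    return fields

creds = parse_wifi_qr("WIFI:T:WPA;S:CompanionAP;P:secret123;;")
print(creds["S"], creds["P"])  # CompanionAP secret123
```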

[0362] FIG. 16 shows an exemplary application of a reality capture device as monitoring device 100 in a crowded area, here for monitoring of a subway station. Typically, a monitoring system comprises a multitude of such monitoring devices 100, which are distributed within the infrastructure in order to provide full coverage with a minimum of blind spots.

[0363] A multitude of moving objects 101, e.g. pedestrians, trains, subway passengers, and marketers, are moving within the area, wherein a monitoring system making use of such monitoring devices 100 may be particularly configured for automatically tracking the moving objects 101 and for automatically detecting a left behind object 102, e.g. a suitcase which could potentially comprise harmful substances or explosives.

[0364] FIG. 17 shows an exemplary embodiment of a reality capture device embodied as monitoring device 100. The top part of the figure shows a side view of the monitoring device 100 and the bottom part of the figure shows a top view of the monitoring device 100.

[0365] The monitoring device 100 comprises a laser scanner 6 configured to carry out a scanning movement of a laser measurement beam relative to two rotation axes and, based thereon, to generate light detection and ranging (LIDAR) data for generating a three-dimensional point cloud.

[0366] The base 11, which supports the laser scanner, is configured as a common sensor platform, which also supports the cameras of a camera unit. In the embodiment shown, the camera unit comprises two visual imaging cameras 103, each visual camera 103 having a field of view of at least 180 degrees, and four thermal imaging cameras 104, each of the four thermal imaging cameras 104 having a field of view of at least 80 degrees.

[0367] The two visual cameras 103 are placed on a circumference around the laser scanner 6 with an angular separation of 180 degrees, and the four thermal cameras 104 are placed on a circumference around the laser scanner 6 with an angular separation of 90 degrees with respect to each other.

[0368] FIG. 18 shows an exemplary embodiment of a receptacle 68 for attaching the mobile reality capture device to an additional component.

[0369] The receptacle 68 has a depression for receiving, along a penetration axis 69, a pin 70 of a connector 71 of an additional component to the mobile reality capture device, and a locking mechanism. The locking mechanism comprises locking bodies 72, e.g. spheres, which, in a locking position, are pushed radially inwards towards the penetration axis 69 in order to engage in a cavity 73 of the pin 70. The locking mechanism is configured such that the locking position is its basic position. The locking mechanism can be set into a release position by pushing a contact element 74 in a direction along the penetration axis 69, which enables the locking bodies 72 to radially escape and thus to release the pin 70.

[0370] In the embodiment shown, the locking mechanism comprises a sliding element 75 configured to be axially movable along the penetration axis 69, wherein prestressing springs push the sliding element 75 into a basic position, which pushes the locking bodies 72 radially inwards.

[0371] The connector 71 comprises the pin 70, having a circumferentially continuous cavity 73, wherein the connector has a release mechanism 76 configured to push, in the locking position, the contact element 74 of the receptacle 68 in the direction along the penetration axis 69.

[0372] The top of the figure shows a connector 71 which is currently moved into the receptacle 68. The middle of the figure shows the connector fixed in the receptacle, which is in its locked position.

[0373] The bottom of the figure shows the connector 71 located in the receptacle 68, wherein the release mechanism is activated and the receptacle is in its release position.

[0374] FIG. 19 exemplarily shows a workflow using a mobile reality capture device having a radio signal module, e.g. a WLAN module, for determining a signal strength of a radio signal, which is available along the trajectory 43 of the mobile reality capture device.

[0375] On the top, the figure shows a top view of a room to be measured, wherein the trajectory 43 of the mobile reality capture device is indicated.

[0376] Distributed in the room are a multitude of WLAN transmitters 77, wherein for each WLAN transmitter the propagation of the WLAN signal is indicated. According to this aspect of the invention, the mobile reality capture device is configured to provide a data set comprising a series of determined signal strengths of the WLAN signal, wherein each signal strength is associated with a position of the mobile reality capture device along the trajectory 43.

[0377] By way of example, as indicated in the bottom of the image, such a data set may then be used to generate a heat map 78 indicating a classification of the room into different radio signal reception areas, e.g. wherein areas of no reception 79, very strong reception 80, strong reception 81, intermediate reception 82, and low reception 83 are identified.
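
The classification into reception areas may be sketched as follows; the dBm thresholds separating the classes are assumptions of this example:

```python
# Map measured signal strengths (dBm) along the trajectory to the
# reception classes of the heat map. The thresholds are hypothetical.
def classify_reception(dbm):
    if dbm is None:
        return "no reception"   # area 79
    if dbm >= -50:
        return "very strong"    # area 80
    if dbm >= -60:
        return "strong"         # area 81
    if dbm >= -70:
        return "intermediate"   # area 82
    return "low"                # area 83

# Signal strength samples keyed by trajectory position (x, y) in metres:
samples = [((1.0, 2.0), -45), ((4.0, 2.5), -65), ((7.5, 1.0), None)]
heat_map = {pos: classify_reception(dbm) for pos, dbm in samples}
print(heat_map[(1.0, 2.0)], heat_map[(7.5, 1.0)])  # very strong no reception
```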

[0378] Furthermore, such data may be used for WLAN based localization, e.g. to be used by a smartphone.

[0379] Although the invention is illustrated above, partly with reference to some preferred embodiments, it must be understood that numerous modifications and combinations of different features of the embodiments can be made. All of these modifications lie within the scope of the appended claims.