METROLOGY SYSTEM
20230099779 · 2023-03-30
Assignee
Inventors
- Duncan REDGEWELL (Nieder Erlinsbach, CH)
- Markus STEINER (Gränichen, CH)
- Thomas LÜTHI (Aarau, CH)
- Veroljub MAKSIMOVIC (Biberist, CH)
- Raimund LOSER (Bad Säckingen, DE)
CPC classification
G01S7/003
PHYSICS
G01S17/87
PHYSICS
G01S17/66
PHYSICS
G01C11/00
PHYSICS
International classification
Abstract
The invention relates generally to a metrology system and coordinate measuring devices to be used within the framework of a smart factory environment having a defined arrangement of different metrology devices, configured such that coordinate measuring data generated by the different metrology devices are referenceable to a common coordinate system.
Claims
1-107. (canceled)
108. A metrology system comprising: a laser tracker having an opto-electronic distance meter configured to emit a distance measurement beam towards a cooperative target of a movable accessory device in the direction of an aiming axis, and to provide distance measurement data based on at least a part of the distance measurement beam returning from the cooperative target, an angle determining unit configured to provide angle determination data for determining the pointing direction of the aiming axis, and a tracking unit configured to receive over a tracking channel a tracking signal which is indicative of a change in angular location of the cooperative target, to determine the change in angular location of the cooperative target based on the tracking signal, and to generate control data for adjusting the alignment of the aiming axis based on the determined change in angular location, the movable accessory device, configured for scanning or probing of an object, and a computing unit configured to generate coordinate measuring data for the object based on the distance measurement data, the angle determination data, and the scanning and/or probing by means of the movable accessory device, wherein the system comprises a referencing unit configured to generate referencing data with respect to a reference point associated with the object, the referencing data providing, independently from the opto-electronic distance meter, for a determination of a relative positional change between the referencing unit and the reference point, wherein the referencing unit and the reference point are intended to be arranged with respect to the object such that a positional change between the referencing unit and the reference point is indicative of a movement of the object, and the computing unit is configured to take into account the referencing data to generate the coordinate measuring data.
109. The system according to claim 108, wherein the reference point is embodied by a cooperative target mounted on the object or defined by a reference surface of the object.
110. The system according to claim 109, wherein the referencing unit is embodied as a camera unit being configured to generate 3D data of the object based on imaging data.
111. The system according to claim 110, wherein the camera unit comprises a projector configured to project a pattern of structured radiation onto the object, and a processor configured to determine a 3D model of the object based on a photogrammetric method and by making use of the pattern of structured radiation.
112. The system according to claim 108, wherein the referencing unit is configured to detect a movement of the object in at least one degree of freedom, particularly a vibration of the object.
113. The system according to claim 112, wherein the referencing unit is configured to detect at least one of a change in roll, a change in pitch, and a change in yaw angle of the object.
114. The system according to claim 108, wherein the computing unit is configured to generate the coordinate measuring data by at least one of compensating a movement of the object determined based on the referencing data, ignoring distance measuring data of the opto-electronic distance meter and/or angle determination data of the angle determining unit in case a relative positional change between the object and the referencing unit exceeds an allowed threshold, and flagging the coordinate measuring data in case a relative positional change between the object and the referencing unit exceeds a threshold defined for ensuring a desired measurement accuracy of the coordinate measuring data.
115. The system according to claim 108, wherein the movable accessory device is at least one of a probing or scanning device configured to approach the object and to carry out at least one of a tactile, a laser based, and a camera based coordinate measurement, particularly embodied as a range imaging camera, a stereo-imaging camera, an articulated arm robot, or a white light sensor, a marking device configured to mark the object, a tool and/or manufacturing instrument, and another coordinate measuring device.
116. A coordinate measuring device configured to generate coordinate measuring data, comprising an opto-electronic distance meter configured to emit distance measurement radiation in the direction of an aiming axis, and to provide distance measurement data based on at least a part of the distance measurement radiation, an angle determining unit configured to provide angle determination data for determining the pointing direction of the aiming axis, and a tracking unit configured to receive over a tracking channel a tracking signal which is indicative of a change in angular location of a cooperative target, to determine the change in angular location of the cooperative target based on the tracking signal, and to generate control data for adjusting the alignment of the aiming axis based on the determined change in angular location, a computing unit configured to generate the coordinate measuring data for an object based on the distance measurement data and the angle determination data, wherein the coordinate measuring device comprises a referencing unit configured to generate referencing data with respect to a reference point, the referencing data providing, independently from the opto-electronic distance meter, for a determination of a relative positional change between the referencing unit and the reference point, wherein the referencing unit is configured and arranged such that the referencing data are referenceable to the coordinate measuring data, and the computing unit is configured to take into account the referencing data to generate the coordinate measuring data.
117. The coordinate measuring device according to claim 116, wherein the referencing unit is embodied as a camera unit being configured to generate 3D data of the object based on imaging data.
118. The coordinate measuring device according to claim 117, wherein the camera unit comprises: a projector configured to project a pattern of structured radiation onto the object, and a processor configured to determine a 3D model of the object based on a photogrammetric method and by making use of the pattern of structured radiation.
119. The coordinate measuring device according to claim 116, wherein the tracking unit comprises a tracking sensor and is configured to generate a tracking beam and to receive the tracking signal based on the tracking beam returning from the cooperative target, by determining a deviation from a zero position on the tracking sensor.
120. A metrology system with at least a first and a second coordinate measuring device in each case configured to automatically track a movable accessory device and to generate coordinate measuring data for determining a position of the movable accessory device, wherein the first and second coordinate measuring device are arranged in a fixed positional relationship such that the coordinate measuring data generated by the first and the second coordinate measuring device are referenceable to a common coordinate system, and the first and the second coordinate measuring device are configured to communicate to each other, and the movable accessory device, configured for scanning or probing of an object, and/or for carrying out an intervention on the object, wherein the first coordinate measuring device is configured to track the movable accessory device and to generate coordinate measuring data when the movable accessory device is located within a first measuring area, the second coordinate measuring device is configured to track the movable accessory device and to generate coordinate measuring data when the movable accessory device is located within a second measuring area, the system is configured to send out a position signal while the movable accessory device is located within the first measuring area, the position signal providing positional information regarding the position of the movable accessory device, and the second coordinate measuring device is configured to initiate tracking the movable accessory device based on the position signal.
121. The system according to claim 120, wherein the first measuring area comprises a transition area interfacing with the second measuring area, the system, particularly the first coordinate measuring device, is configured to send out a trigger signal when the movable accessory device is located within the transition area, and the second coordinate measuring device is configured to initiate the tracking based on the trigger signal.
122. The system according to claim 121, wherein the system is configured to derive an estimated motion path of the movable accessory device, and to send the trigger signal based on the derived estimated motion path.
123. The system according to claim 122, wherein the system is configured to derive a current velocity and/or acceleration parameter of the movable accessory device and/or to derive a current position parameter providing a current position of the moveable accessory device within the transition area indicative of a distance to a boundary of the transition area, and to send the trigger signal based on the derived current velocity and/or acceleration parameter and/or based on the current position parameter.
124. The system according to claim 120, wherein the movable accessory device is at least one of a probing or scanning device configured to approach the object and to carry out at least one of a tactile, a laser based, and a camera based coordinate measurement, a marking device configured to mark the object, a tool and/or manufacturing instrument, and another coordinate measuring device, particularly embodied as a laser tracker.
125. The system according to claim 120, wherein the system is configured to determine a quality parameter for at least one of a coordinate measuring accuracy and a tracking accuracy provided by the coordinate measuring data of the first coordinate measuring device, and the system is configured to send the position signal and/or the trigger signal based on the quality parameter, particularly based on a pre-defined nominal coordinate measuring accuracy to be ensured.
126. The system according to claim 120, wherein the system is configured to determine a potentially upcoming measurement blind spot of the first coordinate measuring device being defined by at least one of a reduced coordinate measuring accuracy by the first coordinate measuring device depending on a relative position of the movable accessory device with respect to the first coordinate measuring device, and a reduced tracking accuracy by the first coordinate measuring device depending on a relative position of the movable accessory device with respect to the first coordinate measuring device, and the system is configured to send the position signal and/or the trigger signal based on the potentially upcoming measurement blind spot.
127. The system according to claim 126, wherein the potentially upcoming measurement blind spot is defined by a disturbing of a generation of coordinate measuring data of the first coordinate measuring device by an interfering object and/or by a disturbing of a tracking signal for tracking the movable accessory device by an interfering object, wherein the system comprises a camera arrangement configured to generate image data, which are referenceable to the common coordinate system, and a recognizer configured to recognize the interfering object within an image of the image data.
Description
[0133] The inventive aspects are described or explained in more detail below, purely by way of example, with reference to working examples shown schematically in the drawing. Identical elements are labelled with the same reference numerals in the figures. The described embodiments are generally not shown true to scale and they are also not to be interpreted as limiting the invention.
[0153] The figure shows an exemplary use case, e.g. measuring and quality control of a workpiece 2 in industrial automobile or aircraft production.
[0154] A typical metrology measurement system for determining 3D coordinates of an object 2 comprises a coordinate measuring device with tracking functionality, in the following also referred to as a tracker, configured to automatically track a movable accessory device and to generate coordinate measuring data indicative of the position, and often also the orientation, of the movable accessory device. For example, the tracking capability of the tracker may be provided by at least one of a video tracking unit, a radio frequency tracking unit, and by optical tracking based on emitting a tracking beam towards a cooperative target.
[0155] The movable accessory device is configured for scanning the object 2, e.g. by means of tactile scanning, laser based scanning, and/or camera based scanning, and/or the movable accessory device is configured for carrying out an intervention on the object 2, e.g. for manufacturing and/or marking the object 2. For example, the movable accessory device may be embodied as scanning device configured to approach the object and to carry out a coordinate measurement itself, e.g. wherein the mobile accessory device is a hand-held scanner 4, a tactile probing or scanning device 29,30A,30B (see
[0156] Metrology systems are often configured such that coordinate measuring data of accessory devices are referenceable to the coordinate system of one of the coordinate measuring devices, e.g. one of the fixed laser trackers, or to an outer coordinate system of a group of coordinate measuring devices.
[0157] By way of example, a movable accessory device is embodied as a handheld scanner 4 configured to emit a local scanning beam 7 in order to scan the object surface in a local coordinate system, wherein the position of the handheld scanner 4 is tracked and measured by a laser tracker 1, and the measuring points of the handheld scanner 4, typically coordinate measuring data in a local coordinate system, are referenceable to the coordinate system of the laser tracker 1.
[0158] Both the movable accessory device and the tracker may also be mounted to a robot, e.g. a UGV 6 (“unmanned ground vehicle”) carrying a tracker 1 or a UAV 206 (“unmanned aerial vehicle”) carrying a photogrammetry camera 207.
[0159] By way of example, the tracker is embodied as an industrial laser tracker 1, which provides for high-precise coordinate measuring and tracking of a cooperative target 3, e.g. a passive reflecting unit with defined reflecting properties such as a steel sphere of known dimensions or a retroreflecting unit such as a cubic prism, wherein at least parts of a laser beam emitted by the laser tracker 1 are reflected back, e.g. in parallel, to the laser tracker. Alternatively, the cooperative target may be an active unit emitting radiation having defined emission properties and/or according to a defined radiation pattern, e.g. provided by one or multiple laser diodes or LEDs, which is identified by the tracking unit of the coordinate measuring device. In other words: In the context of the present application, the term “cooperative target” relates to a target specifically foreseen to be used in conjunction with a tracking unit in order to enable generation of a tracking signal. Thus, the cooperative target “cooperates” with the tracking unit in that it has at least one of distinct reflection properties, distinct emission properties, a known shape, and known dimensions.
[0160] The basic structure of a typical laser tracker 1 comprises an opto-electronic distance meter to determine a distance to an object based on a laser measuring beam 8, wherein the aiming direction of the laser measuring beam 8 can be varied in a motorized movement, e.g. with respect to one or more independent spatial directions.
[0161] Opto-electronic laser distance meters have now become standard solutions in many areas, wherein various principles and methods are known in the field of electronic or electro-optical distance measurement.
[0162] One approach is to emit pulsed electro-magnetic radiation, e.g. laser light, to a target to be measured and to subsequently receive an echo from this target as a back-scattering object, wherein the distance to the target to be measured can be determined by the time-of-flight (ToF), the shape, and/or the phase of the pulse. Another approach is to use an interferometric distance measuring principle, particularly an absolute (i.e. frequency scanning) interferometry method, a frequency modulated continuous wave method (FMCW, particularly C-FMCW), the Fizeau principle, and/or a frequency comb principle.
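In its simplest form, the pulsed time-of-flight principle mentioned above reduces to halving the round-trip travel time at the speed of light. A minimal sketch (function names and example values are illustrative, not from the application):

```python
# Minimal sketch of pulsed time-of-flight ranging: the distance is half
# the round-trip path travelled at the speed of light.

C = 299_792_458.0  # speed of light in vacuum, m/s


def tof_distance(round_trip_time_s: float, group_index: float = 1.0) -> float:
    """Distance from a round-trip pulse time; group_index models the
    refractive index of air (roughly 1.00027 at standard conditions)."""
    return 0.5 * C * round_trip_time_s / group_index


# Example: a pulse echo after about 66.7 ns corresponds to roughly 10 m.
d = tof_distance(66.7e-9)
```

Real instruments additionally correct for atmospheric conditions along the beam path, which the `group_index` parameter only hints at.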
[0163] Furthermore, the laser tracker 1 comprises a tracking unit for providing an automatic adjustment of the aiming direction of the laser measuring beam 8 such that the measuring beam 8 is continuously tracking a target point, wherein a direction of the emission of the laser measuring beam 8 is determined by means of sensors for angle measurement, e.g. angle encoders.
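Combining the distance meter output with the two encoder angles yields a 3D point in the tracker's local frame. A minimal sketch, assuming a common azimuth/elevation convention (the actual angle convention of a given tracker may differ):

```python
import math


def polar_to_cartesian(distance: float, azimuth_rad: float,
                       elevation_rad: float) -> tuple:
    """Convert a tracker measurement (distance plus two encoder angles)
    into Cartesian coordinates in the tracker's local frame.
    Assumed convention: elevation is measured up from the horizon."""
    horizontal = distance * math.cos(elevation_rad)
    x = horizontal * math.cos(azimuth_rad)
    y = horizontal * math.sin(azimuth_rad)
    z = distance * math.sin(elevation_rad)
    return (x, y, z)
```

The angular encoder resolution dominates the lateral coordinate uncertainty at long range, which is why the angle determining unit is as critical as the distance meter.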
[0164] By way of example, for the purpose of continuous target tracking, a deviation of a returning tracking beam 9, e.g. part of the distance measuring beam 8 or a separately emitted tracking beam, may be used to determine a deviation from a zero position on a tracking sensor. By means of this measurable deviation, the aiming direction of the tracking beam 9 can be corrected or continuously adjusted in such a way that the deviation on the tracking sensor is reduced.
[0165] As tracking sensor, a position-sensitive detector (PSD) may be used, e.g. an area sensor which functions in an analog manner with respect to position, with the aid of which a centroid of a light distribution on the sensor surface can be determined.
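The closed-loop correction described above can be sketched as a simple proportional controller on the PSD centroid offset; the gain and function names below are assumptions for illustration, not the application's actual control law:

```python
def tracking_correction(psd_x: float, psd_y: float,
                        gain: float = 0.5) -> tuple:
    """Proportional correction of the aiming direction from the measured
    PSD centroid offset (zero position = beam centred on the target).
    Returns angular increments that drive the offset back towards zero.
    The gain value is illustrative; a real servo loop is more elaborate."""
    d_azimuth = -gain * psd_x
    d_elevation = -gain * psd_y
    return d_azimuth, d_elevation
```

Applied at a high repetition rate, such a loop keeps the deviation on the tracking sensor small as long as the target stays within the PSD's visual field.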
[0166] In order to achieve a high level of accuracy, the visual field of such a PSD is typically selected to be comparatively small, e.g. corresponding to the beam diameter of the tracking beam 9. Therefore, such PSD-based tracking must be preceded by a coupling of the tracking beam 9 to the cooperative target 3. Problems may thus occur when the cooperative target 3 moves so abruptly and rapidly that it disappears from the visual range of the PSD detector. Furthermore, coupling may be lost when the line of sight is interrupted, even if the interruption is only momentary.
[0167] In addition, many different workers and/or movable accessory devices such as articulated arm robots 5 may work on the same object. If two cooperative targets 3 cross each other, the tracking beam related to the target in the background may be blocked by the target carrier of the target in front. In such cases, the laser tracker 1 may carry out a so-called power-lock procedure, e.g. an automatic localization of a cooperative target 3, to automatically re-lock the tracking beam 9 onto a cooperative target 3.
[0168] For example, a power-lock unit comprises a position detection sensor configured to generate position data indicative of an impingement position of an impinging light beam onto the position detection sensor, and a light source, e.g. an LED, configured to radiate a search beam in the direction of the aiming axis, wherein a reflection of at least part of the search beam at a cooperative target is visible as an impinging light beam on the position detection sensor.
[0169] However, since typical power-lock procedures only scan a rather narrow search field of view, the tracking beam may falsely lock onto the cooperative target in front instead of re-locking onto the cooperative target in the background. In the extreme case, the laser tracker may not even notice this wrong re-lock and thus continue to track the wrong target.
[0170] Various principles and methods are known in order to hold the tracking beam 9 in the “coupled” state even during rapid and abrupt movements of the target object or during interruption of the tracking beam 9.
[0171] For example, the laser tracker 1 has a camera configured to have a comparatively wide field of view, i.e. configured to capture the cooperative target 3 as well as the target carrier 4,5,6 and a substantial part of the background. Therefore, on the basis of image processing, e.g. by way of a computer vision algorithm to detect and track a worker in a video stream generated by the camera, the track of the target object or movements of objects moving along with the target object can be determined and used to more easily locate the cooperative target 3 and to couple (re-lock) the laser beam again in the event that the cooperative target 3 has been lost from the "coupled" state.
[0172] Alternatively, or in addition, tracking (of a target by a tracker or generally tracking a device within the metrology system) may be based on other known position determining devices mounted within the smart factory and/or on the laser tracker and/or on a target carrier, e.g. based on a local radio frequency positioning system 208 such as WLAN positioning or 5G positioning 208. Thus, the tracking signal may be embodied as an optical tracking beam 9 but also as any other kind of wireless transmission signal to exchange positional information between the laser tracker 1 and a target carrier 4,5,6.
[0173] The movable accessory device 4,5 may have some markers (not shown) for pose determination and the tracker may have an orientation camera (not shown), wherein the tracker is configured to determine a 6DoF pose (6 degrees of freedom, i.e. position and orientation) of the movable accessory device by means of image processing. Often, the movable accessory device 4,5 also has additional pose determination means, e.g. an inertial measurement unit and/or a visual inertial system, and is configured for wireless communication with the tracker.
[0174] At least part of the coordinate measuring devices and movable accessory devices of the smart factory/metrology environment are configured to communicate with each other, e.g. by device-to-device communication and/or communication over a central computer. In addition, the smart factory/metrology environment may comprise further auxiliary measuring and detection devices configured to communicate with the coordinate measuring devices and/or movable accessory devices. For example, at least part of the devices of the smart factory may be configured for peer-to-peer communication within a company-internal, e.g. secured, 5G network.
[0175] For example, the smart factory may comprise additional surveying devices 10, e.g. a fisheye camera mounted at the ceiling or a camera mounted in a corner of the smart factory, configured to monitor at least a subarea of the smart factory and to recognize objects, e.g. a laser tracker, a movable accessory device, and/or a person. Also, the surveying device may be mobile, e.g. mounted on an AGV or UAV. Thus, in case a laser tracker 1 is unable to find a lost, e.g. temporarily decoupled, cooperative target 3, the surveying camera 10 provides a wider search area than the field of view of a power-lock camera of the laser tracker 1 and can at least provide the laser tracker 1 with a rough location estimate of the lost cooperative target.
[0176] Alternatively, or in addition, since coordinate measuring data generated by different metrology devices within the smart factory are referenceable to each other, instead of having a dedicated surveying device 10, a set of trackers and/or movable accessory devices may also be configured to share between each other some of their sensor information as mutual auxiliary search and surveying information. For example, the laser tracker which has lost its cooperative target may query information of a camera of another laser tracker in order to find and locate the lost target.
[0177] The defined measurement environment made available by the smart factory/metrology environment provides for new and/or improved workflows and measurement systems wherein some exemplary aspects are explained in more detail below.
[0178] It goes without saying that a skilled person will recognize that individual aspects of the metrology system and coordinate measuring devices described below can be combined with each other.
[0179] In particular, although not always mentioned explicitly, it is assumed that the term “coordinate measuring device” relates to a device configured to generate coordinate measuring data for determining a position of a measurement point, e.g. based on a laser measurement beam or based on imaging. For example, the term coordinate measuring device may relate to a tracker and/or scanner, e.g. an industrial laser tracker for tracking and precise coordinate determination of a target object or a 3D-scanner for scanning an environment.
[0180] For example, the basic structure of a laser based coordinate measuring device comprises: an opto-electronic distance meter configured to emit a distance measurement beam in the direction of an aiming axis, in the case of a laser tracker towards a cooperative target, wherein the opto-electronic distance meter is configured to determine distance measurement data based on at least a part of returning radiation of the distance measurement beam; and an angle determining unit configured to provide angle determination data for determining the pointing direction of the aiming axis.
[0181] Laser trackers further comprise a tracking unit configured to receive over a tracking channel a tracking signal which is indicative of a change in angular location of the cooperative target, to determine the change in angular location of the cooperative target based on the tracking signal, and to generate control data for adjusting the alignment of the aiming axis based on the determined change in angular location.
[0182] Thus, typically, a laser tracker works in conjunction with a movable accessory device, which is a device configured for scanning an object, particularly for tactile scanning, laser based scanning, and camera based scanning, and/or for carrying out an intervention on the object, particularly for manufacturing and/or marking the object. In particular, the accessory device may also be a further coordinate measuring device, e.g. a further laser tracker or scanner as outlined in the embodiments below.
[0183] The term “cooperative target” relates to a target specifically foreseen to be used in conjunction with the tracking unit in order to enable generation of the tracking signal. In other words, the cooperative target “cooperates” with the tracking unit of the laser tracker in that it has at least one of distinct reflection properties, distinct emission properties, a known shape, and known dimensions.
[0184] A skilled person will appreciate that a metrology system may be configured to recognize different types of cooperative targets, and that whenever a cooperative target is used to represent a particular target point to be measured, the system is configured to resolve any offset or ambiguity between the position of the cooperative target and the actual measurement point represented by that target, e.g. wherein the system may identify different reflector types such as a triple prism and a cateye prism and automatically have access or knowledge of relative addition constants with respect to their types and mounting positions.
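The resolution of reflector-type-specific offsets described above can be illustrated by a lookup of addition constants; the type names and constant values below are hypothetical, not taken from any product datasheet:

```python
# Hypothetical addition constants per reflector type (metres). Values are
# purely illustrative: a triple prism's optical centre typically sits
# behind its front face, hence a non-zero correction.
ADDITION_CONSTANTS_M = {
    "triple_prism": -0.0345,
    "cateye": 0.0,
}


def corrected_distance(raw_distance_m: float, reflector_type: str) -> float:
    """Apply the type-specific addition constant so the measurement refers
    to the actual target point rather than the optical centre of the
    reflector."""
    return raw_distance_m + ADDITION_CONSTANTS_M[reflector_type]
```

In a system that automatically identifies reflector types, the lookup key would come from the recognition step rather than being supplied by the operator.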
[0185] For example, a cooperative target mounted onto a laser tracker may be arranged and configured in such a way, that a measurement onto that cooperative target is referenceable to a defined point of that laser tracker, e.g. the point of origin of the tracking beam generated by that laser tracker being marked with the cooperative target.
[0186] By way of example, one benefit of having a set of trackers being referencable to each other in the smart factory is that seamless measuring with multiple trackers can be provided as schematically depicted by
[0187] According to this embodiment, the metrology system comprises at least a first 10A and a second 10B tracker, e.g. laser trackers. Each of the two trackers 10A,10B is configured to automatically track a movable accessory device, e.g. a handheld scanner 4.
[0188] By way of example, tracking may be based on a tracking beam of a tracking unit as described above, or the trackers 10A,10B may be configured for video tracking of an object captured by a camera of the trackers 10A,10B.
[0189] Furthermore, the two laser trackers 10A,10B are arranged in a fixed positional relationship such that their generated coordinate measuring data are referenceable to a common tracker coordinate system, e.g. a so-called global smart factory coordinate system or a local coordinate system of one of the two laser trackers 10A,10B.
[0190] By way of example, the two laser trackers 10A,10B are configured to provide their raw or processed data to a central computing unit for further processing and for merging the coordinate data of different coordinate systems. Alternatively, central processing may also be carried out by one of the laser trackers 10A,10B, e.g. one of the laser trackers 10A,10B being configured as a dedicated reference laser tracker.
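Merging coordinate data from different trackers amounts to a rigid transformation into the common frame. A minimal sketch, under the simplifying assumption that the trackers are levelled, so a single yaw rotation plus a translation suffices (a general calibration would use a full 6DoF transformation):

```python
import math


def to_common_frame(point_local: tuple, yaw_rad: float,
                    translation: tuple) -> tuple:
    """Transform a point from a tracker's local frame into the common
    smart-factory frame. Assumes the tracker is levelled, so only a
    rotation about the vertical axis (yaw) and a translation are needed."""
    x, y, z = point_local
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    tx, ty, tz = translation
    return (c * x - s * y + tx, s * x + c * y + ty, z + tz)
```

The yaw angle and translation for each tracker would be determined once during system referencing, e.g. by both trackers measuring common reference targets.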
[0191] Furthermore, the two laser trackers 10A,10B are configured to communicate with each other, e.g. by device-to-device communication and/or communication over a central communication node, and each of the two laser trackers 10A,10B is assigned to a different measurement area 11A,11B.
[0192] For example, a measurement area 11A,11B may be defined relative to the respective laser tracker 10A,10B, e.g. wherein a nominal measuring range of the laser tracker defines a nominal measuring area around the laser tracker. A measurement area 11A,11B may also be specifically defined as any area within reach of the laser tracker 10A,10B, i.e. a specific area of the smart factory to be covered by the respective laser tracker. It goes without saying that the measurement area 11A,11B may be adaptable, e.g. depending on different measuring tasks or system conditions.
[0193] The first laser tracker 10A is configured to track the handheld scanner 4 when it is located within a first measuring area 11A and the second laser tracker 10B is configured to track the handheld scanner 4 when it is located within a second measuring area 11B. Referring now to
[0194] In
[0195] By way of example, the first laser tracker 10A may further be configured to estimate a motion path 14 of the handheld scanner 4, e.g. in order to specifically address a suitable laser tracker to potentially “take over” the movable accessory device or to let a central computing unit estimate the motion path 14, choose and address a suitable laser tracker to take over. In particular, the laser tracker 10A or the central computing unit may be configured to estimate a time period until the movable accessory device leaves the first measuring area.
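Estimating the time until the accessory device leaves the first measuring area can be sketched, under the simplifying assumptions of constant velocity and a circular measuring area, as a ray-circle intersection (names and the 2D model are assumptions for illustration):

```python
import math


def time_to_boundary(position: tuple, velocity: tuple,
                     tracker_center: tuple, radius: float):
    """Seconds until a device moving at constant velocity crosses the
    circular boundary of a measuring area, or None if it never leaves.
    Solves |p + t*v - c| = radius for the exit time t."""
    px = position[0] - tracker_center[0]
    py = position[1] - tracker_center[1]
    vx, vy = velocity
    a = vx * vx + vy * vy
    if a == 0.0:
        return None  # stationary device never leaves
    b = 2.0 * (px * vx + py * vy)
    c = px * px + py * py - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None
    # Starting inside the circle, the larger root is the exit time.
    t = (-b + math.sqrt(disc)) / (2.0 * a)
    return t if t >= 0.0 else None
```

A central computing unit could use such an estimate to address the take-over tracker early enough for a seamless handover.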
[0196] The position signal, here the trigger signal 13, at least comprises positional information for the handheld scanner 4 such that, as depicted by
[0197] For example, the second laser tracker 10B may be configured to lock onto the handheld scanner 4 as soon as the scanner 4 is within a handover area 16 as shown by
[0198] The handover prerequisites may be essentially “static” in the sense that fixed measuring areas 11A,11B are assigned to different laser trackers 10A,10B. However, the measuring areas 11A,11B and/or the transition area 12 may also be dynamically adapted, e.g. depending on a current measuring situation and/or a current arrangement of objects in the smart factory.
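A minimal sketch of the "static" handover rule, assuming circular measuring areas and preferring the closer tracker inside the overlapping transition area; tracker identifiers and the tuple layout are illustrative assumptions:

```python
def responsible_tracker(scanner_pos, trackers):
    """Pick the tracker whose measuring area contains the scanner;
    inside an overlap (transition area) prefer the closer tracker.

    trackers: list of (tracker_id, (x, y), radius) tuples.
    Returns None if the scanner is outside all measuring areas.
    """
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

    candidates = [(tid, dist(scanner_pos, pos))
                  for tid, pos, radius in trackers
                  if dist(scanner_pos, pos) <= radius]
    if not candidates:
        return None
    return min(candidates, key=lambda c: c[1])[0]
```

A dynamically adapted arrangement would simply feed updated area radii or centers into the same rule.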
[0199]
[0200] By way of example, the measuring area of the first laser tracker 10A is generally given as a circular measuring range around the laser tracker and the first laser tracker 10A comprises a panoramic camera arrangement configured to provide 360-degree panoramic image data. The interfering object 17 is then recognized within the panoramic image data based on image processing and a feature recognition algorithm. Furthermore, the laser tracker is configured to determine at least a rough position of the interfering object 17 within its field of view 18A and to determine a blind spot area 19 where its tracking beam is potentially blocked by the interfering object 17.
[0201] The first laser tracker 10A then requests a situational handover comprising a comparison of the measuring areas and/or fields of view 18B of surrounding laser trackers 10B with its own measuring area and the determined blind spot area 19, and a determination of a transition area 12 based on the comparison, i.e. for a handover procedure with another laser tracker 10B as previously described.
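The determination of the blind spot and the check against a surrounding tracker's coverage could look as follows in a 2D simplification, assuming the interfering object is approximated as a circle; all function names are illustrative:

```python
import math

def blind_spot_sector(tracker_pos, obj_pos, obj_radius):
    """Angular sector (center angle, half-width, in radians) of the
    blind spot cast by a roughly circular interfering object, as seen
    from the tracker position."""
    dx, dy = obj_pos[0] - tracker_pos[0], obj_pos[1] - tracker_pos[1]
    d = math.hypot(dx, dy)
    center = math.atan2(dy, dx)
    # Half opening angle of the shadow cone; clamp for very close objects.
    half_width = math.asin(min(1.0, obj_radius / d))
    return center, half_width

def covered_by_other(point, other_pos, other_radius):
    """True if another tracker's nominal measuring range reaches the
    point, i.e. the point can lie inside a transition area."""
    return math.hypot(point[0] - other_pos[0],
                      point[1] - other_pos[1]) <= other_radius
```

Points behind the object within the returned sector are candidates for the comparison with the surrounding trackers' measuring areas.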
[0202] Alternatively, or in addition, a tracker may be configured to be automatically movable in order to avoid a blind spot by repositioning to another tracker location.
[0203]
[0204] For example, the movable laser tracker 20 is configured to move according to a fixed guidance system installed in the smart factory, i.e. essentially along a fixed track, or the movable laser tracker 20 is configured for free movement, e.g. wherein the laser tracker 20 and/or the AGV 21 are configured to be moved based on a positioning or guidance system such as 5G positioning, radio positioning, e.g. based on ultra-wideband radio signals, or WLAN positioning, or wherein the laser tracker 20 and/or the AGV 21 are configured for simultaneous localization and mapping (SLAM).
[0205] Furthermore, the movable laser tracker 20 comprises a measuring aid 22 configured in such a way and arranged in a fixed relationship with the laser tracker 20 such that a movement and orientation of the measuring aid 22 is indicative of a movement and orientation of the movable laser tracker 20. For example, the measuring aid 22 is mounted on the laser tracker 20 (or the AGV 21) or is embodied as an integral part of the movable laser tracker 20. In particular, the movable laser tracker 20 may comprise one or multiple such measuring aids 22, e.g. arranged at different faces of the laser tracker.
[0206] The measuring aid 22 comprises a retroreflector 23 configured to be tracked by a reference laser tracker 26.
[0207]
[0208] Furthermore, the laser tracker is configured to determine at least a rough position of the interfering object 17 within its field of view 18 and to determine a blind spot area 19 where its tracking beam is potentially blocked by the interfering object 17. Based on this information a suitable re-location position 25 for avoiding the measurement blind spot 19 can be determined.
[0209] The movable laser tracker 20 can then be sent to the re-location position 25, wherein at least the end position of this movement is determined by a reference laser tracker 26. For example, before starting to move, the movable laser tracker 20 sends out a trigger command 13 in order to request tracking by an external referencing device 26, e.g. wherein the trigger command comprises information on the current position of the movable laser tracker 20 and/or on the re-location position 25.
[0210]
[0211] By way of example, a fixedly installed reference laser tracker 26 is used with respect to which the coordinate measuring data of multiple laser trackers and other movable accessory devices are referenced. However, it is also possible that one of the movable laser trackers 20 is configured to act as reference laser tracker. In particular, a dedicated external computing unit 27 in connection with at least the reference laser tracker 26 may be provided for data processing and data merging.
[0212] Thus, the reference laser tracker 26 and a multitude of movable laser trackers 20 form a variable laser tracker arrangement, wherein each change of the arrangement is automatically tracked, i.e. the reference laser tracker 26 and each movable laser tracker 20 except the last movable laser tracker in the measuring chain are configured to automatically track a respectively assigned movable laser tracker 20, e.g. wherein each movable laser tracker 20 is configured to send out a trigger command 13 for requesting a tracking by its upstream laser tracker. In other words, each movable laser tracker 20 is configured to act as a relay device 28 between the reference laser tracker 26 and a movable accessory device, here in the form of an articulated arm robot 5.
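Referencing along such a relay chain amounts to composing the relative poses measured by each upstream tracker. A planar sketch under the assumption of 2D poses (x, y, theta); in practice full 6DoF transforms would be composed the same way:

```python
import math

def compose(pose_ab, pose_bc):
    """Compose two planar poses (x, y, theta): given B's pose in frame A
    and C's pose in frame B, return C's pose in frame A."""
    xa, ya, ta = pose_ab
    xb, yb, tb = pose_bc
    return (xa + xb * math.cos(ta) - yb * math.sin(ta),
            ya + xb * math.sin(ta) + yb * math.cos(ta),
            ta + tb)

def reference_chain(poses):
    """Chain the relative poses measured along the relay: reference
    tracker -> movable tracker 1 -> ... -> accessory device; returns
    the accessory device's pose in the reference tracker's frame."""
    result = (0.0, 0.0, 0.0)  # the reference tracker defines the origin
    for p in poses:
        result = compose(result, p)
    return result
```

Each element of `poses` is what one tracker in the chain measures for its downstream neighbour; the composition yields the common coordinate system anchored at the reference laser tracker.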
[0213] In industrial metrology a wide range of different sensors and measurement devices may be used, wherein typically each sensor and device having multiple operation modes requires dedicated setup procedures. Quite often correct setup and combination of different devices for a particular measurement procedure requires some degree of expert knowledge.
[0214]
[0215] By way of example, the plurality of coordinate measuring devices comprises two fixed laser trackers 26 and a movable laser tracker 20. The plurality of movable scanning and/or probing devices comprises a handheld scanner 4, a tactile 6DoF probe 29, and a set of different retroreflecting prisms 30A,30B.
[0216] One of the fixed laser trackers comprises an examination arrangement 31, e.g. an examination camera, configured for generating examination data. The examination arrangement may be configured to automatically recognize an object placed within an examination field of view 32. Alternatively, the examination arrangement may be configured such that generation of examination data is triggered by user input.
[0217] The examination arrangement 31 is coupled to a computing unit 27 comprising an evaluator, configured to provide classification of a movable accessory device 30B captured by the examination data, and comprising a database with preset information for the plurality of coordinate measuring devices 20,26 associated with different scanning and/or probing devices 4,29,30A,30B to be used in combination with the coordinate measuring devices 20,26.
[0218] According to one embodiment, the system is configured to automatically carry out a recognition of a movable accessory device 4,29,30A,30B placed in the examination field of view 32 of the laser tracker and to automatically classify the recognized movable accessory device. The image data of the examination arrangement 31 is then provided to the computing unit 27, which provides classification of the recognized prism 30B into a corresponding probe class, e.g. based on a comparison of the image data of an examination camera with template images of the plurality of accessory devices 4,29,30A,30B stored in an image database.
[0219] Based on the classification, the computing unit 27 automatically provides dedicated preset instructions to the plurality of coordinate measuring devices 20,26.
[0220] In particular, presetting may comprise at least one of switching-on at least a subgroup of the coordinate measuring devices and carrying out a calibration procedure to calibrate the subgroup of coordinate measuring devices, e.g. to load relative addition constants of the recognized prism 30B and to reference each device of the subgroup with respect to a common coordinate system.
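The classification-to-preset step can be sketched as a nearest-template classifier feeding a preset database. The class names, feature vectors, addition constants, and tracker identifiers below are purely illustrative assumptions, not values from the disclosure:

```python
# Hypothetical preset database mapping a recognized probe class to
# presetting actions (devices to switch on, addition constant to load).
PRESETS = {
    "prism_1p5in": {"addition_constant_mm": -16.9, "power_on": ["tracker_A"]},
    "prism_0p5in": {"addition_constant_mm": -11.3, "power_on": ["tracker_A"]},
    "6dof_probe":  {"addition_constant_mm": 0.0,
                    "power_on": ["tracker_A", "tracker_B"]},
}

def classify(feature_vector, templates):
    """Toy nearest-template classifier standing in for the image-based
    comparison against stored template images."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda name: dist(feature_vector, templates[name]))

def preset_for(feature_vector, templates):
    """Look up the preset instructions for the recognized probe class."""
    return PRESETS[classify(feature_vector, templates)]
```

A real implementation would replace the toy feature comparison with the image processing and feature recognition described above; the lookup structure stays the same.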
[0221] In addition, the system may further comprise a database with operating instructions for at least part of the coordinate measuring devices and/or at least part of the accessory devices. For example, the system comprises a user aid device, e.g. video glasses 33, configured to provide step-by-step operating instructions to a user based on the classification, particularly wherein the user aid device is configured to work with augmented reality technology, e.g. is embodied as augmented reality glasses or as an augmented reality helmet.
[0222] By way of example, the operating instructions comprise at least one of instructions for setting a movable accessory device, instructions for setting a coordinate measuring device, instructions for using a movable accessory device in conjunction with a coordinate measuring device, and cautionary information, particularly with regard to user safety and measurement accuracy.
[0223] For example, presetting of the devices and/or provision of operating instructions may further be based on the skill level of a user operating the system, e.g. wherein the system is configured to read information from an RFID tag carried by a user, the RFID tag comprising information about a user category assigned to the user. For example, the system has access to a database comprising user details such as a list of system users and/or a list of user categories linked to information regarding levels of user training and/or user authorization.
[0224]
[0225] By way of example, the system comprises an examination camera 35, e.g. a RIM camera mounted to a user aid device 33, configured for generating examination data, e.g. 3D range imaging data.
[0226] Furthermore, the system comprises an evaluator, e.g. based on an image processing and feature extraction algorithm, for classification of the examination data in order to identify a measurement area 34 on an object 2 to be measured, i.e. wherein the measurement area 34 requires measurement by a dedicated measuring procedure, e.g. wherein some measurement areas 34 require point-wise coordinate measurements by a tactile probe 29 at some reference positions while other measurement areas 34 require a local scan with a certain point resolution, e.g. by a stereo-camera arrangement. The examination camera 35 may be configured to automatically generate examination data, e.g. based on sequential or continuous data acquisition, and to automatically recognize different measurement areas 34. Alternatively, the examination camera may be configured such that generation of examination data is triggered by user input.
[0227] In order to provide the necessary computing power for classification of the examination data, identification of the measurement area 34 may be executed on a dedicated computing unit 27 connected to the examination camera 35.
[0228] Based on the identified measurement area 34, a user is provided with an instruction 36 to use a specific movable accessory device, e.g. a tactile 6DoF probe 29, particularly together with instructions on how to use the specific movable accessory device, e.g. instructions for setting the specific movable accessory device and an associated laser tracker as well as how to use the accessory device in conjunction with the associated laser tracker.
[0229] For example, the measurement instructions 36 for the identified measurement area 34 are provided as visual and/or acoustic instructions, e.g. wherein written instructions are provided to a user by video glasses 33, particularly working with augmented reality technology to directly indicate at least approximate measuring positions within the measurement area 34.
[0230] In particular, the provision of the measurement instructions 36 may consider a training level of the user, e.g. wherein the user carries an RFID identification tag 37 providing a user category with associated user skills indicating usable measurement devices.
[0231] For example, based on the classification 38 of the measurement area 34, the system determines a preselection 39 of suitable movable accessory devices for executing the necessary measurement tasks for measuring the measurement area, wherein the preselection 39 is further reduced based on the user ID in order to provide the final selection 36 of accessory devices and corresponding instructions.
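The reduction of the preselection by the user ID can be sketched as a simple filter against a skill database; the user categories and device type names are illustrative assumptions:

```python
def final_selection(preselection, user_category, skill_db):
    """Reduce the task-based preselection of accessory devices to
    those the user is trained and/or authorized for.

    skill_db maps a user category (e.g. read from an RFID tag) to the
    set of device types that category may use."""
    usable = skill_db.get(user_category, set())
    return [dev for dev in preselection if dev in usable]
```

An unknown user category yields an empty selection, i.e. no instructions are issued for an unauthorized user.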
[0232] By way of another example, the system is configured such that the provision of the measurement instructions is based on a settable nominal measurement parameter for the measurement area, e.g. wherein a user defines a nominal point density or accuracy to be achieved, i.e. necessitating a particular accessory device to be used, and/or the system may be configured to perform a tradeoff analysis to optimize the overall measurement time.
[0233]
[0234] By way of example, in a metrology area a plurality of different measurement devices is present, e.g. coordinate measuring devices such as a static laser tracker 1 and a movable laser tracker 20, and a variety of movable accessory devices such as tactile 6DoF probes 29, retro-reflective prisms 30A,30B, handheld scanning devices 4, and stereo-scanners, e.g. a stereo-camera mounted on an articulated arm robot 5.
[0235] Often a particular measuring task may be carried out using different sets of measuring devices, wherein selection of an appropriate set of devices typically requires an overview of available measurement devices and some level of expert knowledge.
[0236] In particular, in an extensive metrology environment, e.g. a large construction hall with many workers, an overview of the availability of measurement devices can be cumbersome because many measuring tasks are running in parallel and many workers are using devices from the same set of devices at the same time. In addition, some devices may be subject to degradation effects or even be broken or under repair.
[0237] Furthermore, it has to be ensured that workers or devices carrying out different measuring tasks do not interfere with each other, e.g. by blocking each other's laser measuring beams.
[0238] According to this embodiment of the invention, the metrology system is configured to capture status information indicative of a current status of a plurality of measurement devices and to use the status information to determine a suitable group of measurement devices from the plurality of measurement devices to carry out a given measurement task.
[0239] For example, the system may comprise a central computing unit 27 configured to have access to a predefined list of measurement tasks to be carried out, wherein for each measuring task at least one group of device types is defined which may be used for carrying out the task. A user may then select one measuring task 40 out of the list of measuring tasks to be carried out next. Alternatively, instead of selecting a pre-defined measuring task, the system may also be configured such that a user sets different measurement parameters and thus defines a new measuring task 40, together with which types of measurement devices may be suitable.
[0240] In addition, the system comprises a communication network 41 configured to provide communication between measurement devices 1,4,5,20,29,30A,30B and the computing unit 27. Some measurement devices 4,29,30A,30B may not be able to directly communicate with the computing unit 27. However, when unused they may be stored in a storage unit 42 having dedicated shelves for placing each device type 4,29,30A,30B and being configured to recognize if a device is stored and ready to use. The storage unit 42 is further configured for communication over the communication network 41 and thus provides at least information on the availability of these devices 4,29,30A,30B. Additionally, the storage unit 42 may further be configured to provide further status information on the stored measurement devices 4,29,30A,30B, e.g. wherein the status information is automatically derived or manually stored by a user.
[0241] Upon selection/definition of the measuring task 40 to be carried out next, or, for example, in a continuous manner, the system is configured to capture the actual status of the plurality of measurement devices, e.g. wherein a broadcast or multicast signal is sent over the communication network 41 requesting a response from the measurement devices 1,5,20 and the storage unit 42. For example, the response may comprise at least positional information indicative of a current location or availability of the respective measurement device 1,4,5,20,29,30A,30B.
[0242] Based on the selection/definition of the measuring task 40 and the actual status of the measurement devices 1,4,20,29,30A,30B the system determines a group of measurement devices, particularly with suitable measurement instructions to be used for the measurement task 40.
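The group determination from the gathered status responses can be sketched as a per-type availability lookup; the report fields and device type names are illustrative assumptions about the status response format:

```python
def determine_group(task_requirements, status_reports):
    """Pick one available device per required device type, based on the
    status responses gathered over the communication network.

    status_reports: list of dicts like
      {"id": "tracker_1", "type": "laser_tracker", "available": True}
    Returns None if some required type has no available device."""
    group = []
    for required_type in task_requirements:
        match = next((r for r in status_reports
                      if r["type"] == required_type and r["available"]), None)
        if match is None:
            return None  # task cannot be carried out right now
        group.append(match["id"])
    return group
```

A refinement could rank candidates of the same type, e.g. by proximity to the measuring task or by calibration age.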
[0243] The group of measurement devices is then made recognizable to a user by visual 43 and/or acoustic 44 notifications, e.g. wherein the measurement devices are configured to provide a red (occupied) and green (available) color code or simply an on/off code (e.g. LED on=available, LED off=occupied), and/or an acoustic pattern.
[0244] Alternatively, or in addition, the system may comprise a pointer unit (not shown) configured to emit a directed visual pointing beam towards measurement devices to be used, i.e. to mark measurement devices by pointing to them. For example, a laser tracker 1 may be configured to point to a movable scanner and/or probe 4,29,30A,30B which has to be used in conjunction with the tracker 1. The computing unit 27 may also be connected to a display (not shown), configured to provide information, e.g. device locations, about the group of measurement devices.
[0245] The system may further be configured to, particularly continuously, gather the status of the plurality of measurement devices, e.g. at least the position and an activity indication, in order to determine a free movement area 45 for interacting with the group of measurement devices and/or a restriction area 46 which shall not be entered. Within the free movement area 45 the group of measurement devices to be used for carrying out the measuring task 40 or any other object, e.g. a person carrying a handheld scanner 4 for carrying out a local scan of an object to be measured, can freely move without the risk of disturbing a measurement task carried out by the remainder of the plurality of measurement devices. Conversely, an area associated with a measurement task carried out by the remainder of the plurality of measurement devices may be determined to be a restriction area 46.
[0246] The extent and location of a free movement area 45 and/or a restriction area 46 are made visible to the workers in the metrology environment, e.g. by dedicated LED markings 47,48 on the ground or by a video projecting system configured to project dedicated markings onto a surface in the smart factory.
[0247] In addition, the system may be configured to monitor the restriction area 46 and to generate an alarm signal in case a person and/or object enters the area, e.g. based on a monitoring system using RFID tags, indicating access right to different areas 45,46 of the metrology environment.
[0248]
[0249] The system is a measurement environment comprising coordinate measuring devices, e.g. fixed laser trackers 1 and movable laser trackers 20, having positions which are referenceable with respect to an outer coordinate system.
[0250] By way of example, in a conventional remote support system an onsite worker wears smart glasses equipped with a camera configured such that a remote expert has access to a current view of the onsite operator wearing the smart glasses. Thus, the remote expert can guide the onsite worker through a task to fix a problem.
[0251] However, even with a real-time onsite view, it may be cumbersome to guide the onsite worker 49 through a complicated measuring task, e.g. involving correct setup of different measuring devices at different locations within the measurement environment and approaching various measurement points.
[0252] According to this embodiment of the invention, guiding an onsite worker 49 is particularly sped up and facilitated in that the remote expert 50 has the possibility to remotely use onsite devices 1,20 for pointing at different measurement devices and locations where the onsite worker 49 needs to take some action, e.g. measuring or investigating something.
[0253] In order to provide this pointing functionality, the system is configured to process inner referencing data from a mobile user aid device 51, e.g. smart glasses with a camera 52, the inner referencing data being indicative of a position and orientation of the mobile user aid device 51 relative to the fixed measurement environment. For example, the inner referencing data may be imaging data, distance measuring data, and/or data from an inertial measuring unit of the user aid device 51.
[0254] Thus, based on the inner referencing data and outer referencing data indicative of the measurement environment represented in an outer coordinate system, the remote support system can determine a position and orientation of the mobile user aid device 51 relative to the outer coordinate system.
[0255] Furthermore, the system is configured to read input data provided by the remote expert 50, the input data defining guiding points within the measurement environment, e.g. a position of a measurement device 53 or a position of a point to be measured 54. For example, the remote expert 50 may explicitly provide coordinates of the guiding points 53,54 in the outer coordinate system and/or the remote expert may provide a point ID, e.g. a particular measurement device, wherein the point ID is automatically linked by the system to associated point ID coordinates. Then, based on this input data, a coordinate measuring device, e.g. a laser tracker 1,20 configured to generate a visual laser pointing beam 55, is instructed to point to one of the guiding points 53,54.
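Instructing a coordinate measuring device to point at a guiding point reduces to computing the beam alignment from the tracker position to the guiding point coordinates in the outer coordinate system. A minimal sketch, assuming both positions are given as (x, y, z) tuples:

```python
import math

def pointing_angles(tracker_pos, guide_point):
    """Azimuth and elevation (radians) for aligning a visual pointing
    beam from the tracker position onto a guiding point, both given
    in the outer coordinate system as (x, y, z)."""
    dx = guide_point[0] - tracker_pos[0]
    dy = guide_point[1] - tracker_pos[1]
    dz = guide_point[2] - tracker_pos[2]
    azimuth = math.atan2(dy, dx)
    elevation = math.atan2(dz, math.hypot(dx, dy))
    return azimuth, elevation
```

The angles would then be handed to the tracker's alignment drives; resolving a point ID to coordinates is a database lookup preceding this step.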
[0256] By way of example, as depicted by
[0257] In particular, the outer coordinate system may also be generated based on data from one of the measurement devices 1A,1B,20, e.g. image data from a camera or point cloud data generated by a laser scanning unit. For example, the outer referencing data may be based on a so-called full-dome scan by one of the measurement devices 1A,1B,20, i.e. imaging or point cloud data providing an essentially 90° (vertical) times 360° (azimuthal) field of view. In particular, a photorealistic 3D model can be derived from these data of one of the measurement devices, such that a geometrical 3D twin of the measurement environment is accessible for the remote expert 50.
[0258]
[0259] By way of example, typical laser trackers often have a visual target localization functionality for automatically or at least semi-automatically locking a tracking beam 9 onto a cooperative target 3. Sometimes this functionality is called a “power-lock procedure”, e.g. wherein the laser tracker 100 automatically scans a perimeter to find a cooperative target 3.
[0260] However, typical power-lock procedures have a quite narrow searching field of view 56 for localizing a cooperative target 3. Thus, either by moving the cooperative target 3 or by adjusting the orientation of a visual target localization unit associated with the power-lock functionality, at least a coarse pre-alignment of the cooperative target 3 and the target localization unit has to be carried out in order to bring the target 3 inside the searching field of view 56 of the visual target localization unit.
[0261] By way of example, the visual target localization unit may comprise a light source, particularly an LED, configured to radiate a search beam in the direction of the aiming axis, wherein a reflection of at least part of the search beam at the cooperative target is visible as a target point to a position detection sensor. Alternatively, or in addition, target localization may also be carried out by radio signal based localization, e.g. by using radio frequency telegram transceiver (RFTT) modules referenced to the coordinate measuring device and the cooperative target, respectively, e.g. wherein the positions of a first and a second RFTT anchor module are referenced to the laser tracker and an RFTT tag module is referenced to the cooperative target such that a rough location of the cooperative target is determined based on transmission of radio frequency telegrams between the RFTT anchor and tag modules.
[0262] According to this embodiment of the invention, the laser tracker 100 comprises an acoustic localization unit, e.g. with at least two microphones for stereo-localization, configured to detect and identify an acoustic identifier signal 57. For example, the identifier signal 57 may be a passphrase or password which is specific for a particular laser tracker 100 or a particular group of laser trackers, e.g. “hey laser tracker!” for generally addressing a laser tracker or “find me laser tracker five!” for specifically addressing the fifth laser tracker out of a group of laser trackers. Upon detection and identification of the passphrase 57 the laser tracker 100 automatically determines the direction of origin of the passphrase 57 and automatically adjusts the alignment of the aiming axis in order to align the aiming axis onto the determined direction of origin, i.e. providing a coarse alignment in order to bring the target 3 inside the searching field of view 56 of the visual target localization unit.
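With two microphones, the direction of origin can be estimated from the time difference of arrival (TDOA). A sketch under a far-field assumption; constants and names are illustrative:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, approximately at 20 degrees C

def direction_from_tdoa(delta_t, mic_spacing):
    """Direction of origin of the acoustic identifier signal from the
    time difference of arrival (seconds) between two microphones
    spaced mic_spacing meters apart, assuming a far-field source.

    Returns the angle (radians) relative to the broadside direction
    of the microphone baseline."""
    # Path difference between the two microphones.
    path_diff = SPEED_OF_SOUND * delta_t
    # Clamp against measurement noise before taking the arcsine.
    ratio = max(-1.0, min(1.0, path_diff / mic_spacing))
    return math.asin(ratio)
```

A stereo pair only resolves the bearing in one plane; a third microphone (or rotating the baseline) would be needed for a full direction, which is sufficient here since only a coarse pre-alignment is required.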
[0263] In addition, the laser tracker may be configured to generate a visible light cone 58 indicative of the field-of-view 56 of the visual target localization unit. Thus, in addition to the adjustment of the orientation of the visual target localization unit, the worker holding a cooperative target 3 may also manually adjust the position of the cooperative target 3 in order to bring it inside the visible light cone 58, e.g. in case the acoustically driven alignment is slightly off the cooperative target 3.
[0264]
[0265] By way of example, the system comprises a laser tracker 1 to determine the position of a handheld scanner 4 configured to carry out a laser based measurement of an object to be measured 2. Based on the coordinate measuring data by the laser tracker 1 and the measurement by the movable accessory device, here the handheld scanner 4, a computing unit 27 determines positional data of the object to be measured 2, e.g. for comparison with specified manufacturing tolerances for that object 2, e.g. a comparison with CAD data of the object.
[0266] In addition, the system comprises a referencing device 59, configured to generate referencing data with respect to a reference point, wherein the referencing data are indicative of a distance change between the referencing device 59 and the reference point. For example, the reference point may be a point on the object to be measured 2, e.g. a cooperative target mounted on the object to be measured. Alternatively, the reference point may be arranged away from the object to be measured 2, while the referencing device 59 is arranged at the object to be measured, namely in such a way that a distance change between the referencing device 59 and the reference point is indicative of a movement of the object to be measured 2.
[0267] In other words, the referencing device 59, the reference point, and the object to be measured 2 are arranged with respect to each other such that the referencing device 59 is able to pick up a movement of the object to be measured 2 in at least one degree of freedom, e.g. a vibration 61 of the object to be measured 2.
[0268] For example, the referencing device 59 is embodied as a stereo scanner with two cameras 63 arranged spaced apart from each other, and a projector 64 configured to project a pattern 65 of structured radiation onto the object 2, e.g. a fringe or Moiré projection pattern. The stereo scanner is configured to determine a 3D model of the object 2 based on a stereo-photogrammetric method and by making use of the pattern of structured radiation 65.
[0269] Therefore, the computing unit 27 can be provided with the measuring data of the stereo scanner in order to compensate any movement of the object 2 when determining the positional data. By way of example, without having the referencing data, an inherent vibration of the object 2 or another unwanted movement of the object, e.g. caused by a shock when the worker accidentally hits the object 2, would have gone unnoticed in the measurement by the movable accessory device.
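The compensation itself can be sketched as subtracting the object displacement seen by the referencing device from the points measured by the movable accessory device, assuming both data streams are time-synchronized and expressed in the same coordinate system (both assumptions are illustrative simplifications):

```python
def compensate(measured_points, object_displacements):
    """Subtract the object movement picked up by the referencing
    device from the accessory device's measured points, sample by
    sample; points and displacements are (x, y, z) tuples."""
    return [tuple(m - d for m, d in zip(point, disp))
            for point, disp in zip(measured_points, object_displacements)]
```

In practice the displacement samples would be interpolated to the measurement timestamps, and a rigid-body transform would replace the pure translation if the object also rotates.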
[0270] Alternative to the computing unit 27 being separate, it may also be part of the tracker determining the position of the movable accessory device or part of the referencing device 59.
[0271]
[0272]
[0273] Each laser tracker of the system further has a target localization unit configured for localizing a cooperative target 3 within the perimeter of the respective laser tracker, e.g. a cooperative target of another laser tracker, and for deriving localization data configured to orient the tracker aiming axis towards the localized cooperative target 3.
[0274] Thus, by way of example, each laser tracker may be configured to automatically search its perimeter to localize one or a plurality of cooperative targets 3, and to automatically generate local coordinate measuring data with respect to one or more localized cooperative targets 3.
[0275] The local coordinate measuring data of different laser trackers 101A,101B may then be exchanged between laser trackers and/or provided to a central computing unit, enabling the system to reference the local coordinate measuring data of at least part of the plurality of laser trackers to a common coordinate system.
[0276] By way of example, in case of complex objects to be measured, more than a single laser tracker position and/or a plurality of different laser trackers are needed. In order to combine the data acquisition of multiple positions of a single tracker or from multiple trackers, identical points-of-interest (POIs), typically more than three, need to be measured from all positions/by all trackers in order to calculate a transformation between different local coordinate systems.
[0277] By way of another example, an automatic referencing to a common coordinate system may be carried out at an initialisation of the system, e.g. wherein all laser trackers of the system are referenced to each other.
[0278] In another example, a group of laser trackers may be defined to carry out a specific measuring task, wherein only the group of laser trackers is referenced to the common coordinate system. In particular, the system may be configured to monitor the group of laser trackers, in order to automatically detect a change or problem within the group upon which automatically a new referencing is carried out.
[0279] For example, in case a laser tracker has to be repaired and is thus replaced by another laser tracker, the added replacement tracker is automatically referenced to the group coordinate system, e.g. wherein the replacement tracker measures a set of reference points and/or wherein a set of laser trackers of the plurality of trackers determines relative distances between the set of laser trackers and the replacement tracker. Thus, the coordinate measuring data of the replacement laser tracker may be “integrated” into the common group coordinate system based on triangulation or multilateration principles.
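The multilateration step for integrating the replacement tracker can be sketched in a 2D least-squares form, assuming known anchor tracker positions and measured distances to the replacement tracker; the linearization and names are illustrative:

```python
def multilaterate_2d(anchors, distances):
    """Locate a replacement tracker from measured distances to
    trackers with known positions (2D least-squares multilateration;
    needs at least three non-collinear anchors).

    Subtracting the first range equation from each other one gives
    linear equations in (x, y):
      2(xi - x0) x + 2(yi - y0) y
        = d0^2 - di^2 + xi^2 - x0^2 + yi^2 - y0^2
    """
    x0, y0 = anchors[0]
    d0 = distances[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        rows.append((2.0 * (xi - x0), 2.0 * (yi - y0)))
        rhs.append(d0 ** 2 - di ** 2 + xi ** 2 - x0 ** 2 + yi ** 2 - y0 ** 2)
    # Solve the normal equations A^T A p = A^T b for p = (x, y).
    a11 = sum(r[0] * r[0] for r in rows)
    a12 = sum(r[0] * r[1] for r in rows)
    a22 = sum(r[1] * r[1] for r in rows)
    b1 = sum(r[0] * v for r, v in zip(rows, rhs))
    b2 = sum(r[1] * v for r, v in zip(rows, rhs))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)
```

The 3D case adds a z column to the linear system but follows the same subtraction trick; with more than three anchors the least-squares solution averages out distance measurement noise.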
[0280]
[0281] In this example, the referencing tracker 102 and one of the group laser trackers 103A are embodied as fixed stations, i.e. having a fixed position within the metrology environment, wherein two of the group laser trackers 103B,103C are embodied as movable laser trackers, e.g. according to any embodiment of the inventive movable laser tracker. Furthermore, the referencing tracker 102 is connected to a computing unit 27 and configured to communicate with each of the group laser trackers 103A,103B,103C in order to forward coordinate measuring data of each of the group laser trackers to the computing unit 27.
[0282] By way of example, the referencing tracker 102 itself determines the position of one of the moveable group laser trackers 103B, which in turn determines positional data with respect to the other movable group laser tracker 103C, which in turn determines positional data with respect to the stationary group laser tracker 103A. In addition, the stationary group laser tracker 103A provides positional data with respect to the referencing tracker 102.
[0283] Therefore, the computing unit 27 is then able to reference the group laser trackers 103A,103B,103C with respect to each other based on the knowledge that the referencing tracker 102 is a fixed point in the common coordinate system.
[0284] It goes without saying that the system may make use of additional position determining means, e.g. inertial measuring units on the movable laser trackers 103B,103C or a GPS based positioning system, in order to resolve an ambiguity of an underdetermined system or for simplifying and/or speeding up the referencing. For example, instead of only determining a relative position, at least some of the plurality of laser trackers may be configured to additionally determine a pose, i.e. position and orientation, of another laser tracker.
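The chained referencing of paragraphs [0282]–[0284], in which each tracker measures the pose of the next and the loop is closed back at the referencing tracker 102, can be sketched as follows. This is a minimal illustration under the assumption that each measurement is available as a 4x4 homogeneous pose; the function names are chosen for the example:

```python
import numpy as np

def chain_to_common(relative_poses):
    """Chain relative pose measurements into the common frame of the
    referencing tracker: given T(ref->B), T(B->C), T(C->A) as 4x4
    homogeneous matrices, return each tracker's pose expressed in the
    referencing tracker's coordinate system."""
    T = np.eye(4)
    poses = []
    for T_rel in relative_poses:
        T = T @ T_rel
        poses.append(T.copy())
    return poses

def loop_closure_residual(relative_poses, T_close):
    """Residual of the closed loop (e.g. tracker A's measurement back to
    the referencing tracker): near zero if all measurements are
    mutually consistent."""
    T = np.eye(4)
    for T_rel in relative_poses:
        T = T @ T_rel
    return T @ T_close - np.eye(4)
```

A non-zero loop-closure residual is exactly the ambiguity or inconsistency that the additional position determining means (IMUs, GPS) mentioned above can help to resolve.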
[0285]
[0286] By way of example, particularly in the case of large objects to be measured, e.g. overhanging aircraft components in a production hall, a worker 49 carrying a movable accessory device having a cooperative target 3 may need to take up different measurement positions, e.g. walk around the object to be measured. This may lead to interruptions of a tracking beam 9, e.g. when the tracking beam is blocked by the worker himself as he turns around.
[0287] In addition, many different workers may work on the same object to be measured. If two workers cross each other, the tracking beam related to the worker in the background is blocked by the person in front. In such cases, the laser tracker may carry out a power-lock procedure to find a cooperative target and to automatically re-lock the tracking beam onto it. However, since power-lock procedures typically have a quite narrow search field of view, it may happen that, instead of re-locking the tracking beam onto the movable accessory device of the worker in the background, the tracking beam is falsely locked onto the movable accessory device of the worker in front. In the extreme case, the laser tracker may not even notice this wrong re-lock and thus continue to track the wrong worker.
[0288] In many cases, the worker may also need to stow the accessory device for repositioning, e.g. to securely climb a ladder to reach the next scanning position. In this case, a camera may be used to track the position of the worker as a coarse position of the movable accessory device, e.g. by way of a computer vision algorithm to detect and track the worker in a video stream generated by the camera, in order to enable a quick re-lock of the tracking beam onto the cooperative target of the movable accessory device as soon as the worker has reached the next scanning position.
[0289] According to this embodiment of a laser tracker 104, the laser tracker 104 comprises a typical tracking unit configured to receive over a tracking channel a tracking signal, e.g. based on a tracking beam 9.
[0290] In addition, the laser tracker 104 has a second tracking unit comprising a camera configured to generate image data, wherein the second tracking unit is configured for video tracking of a target carrier, e.g. a housing or support structure of the cooperative target 3, or a machine or person 49 carrying the cooperative target 3.
[0291] The second tracking mode makes use of a recognizer—e.g. based on a computer vision algorithm to detect and track objects in a video stream—for determining a position of the target carrier, e.g. a worker 49, within an image of the image data generated by the camera. In particular, recognition indicia indicative of the appearance of the worker 49 imaged by the camera are determined by image processing, e.g. wherein the recognizer is configured to recognize the target carrier 49 based on pre-defined recognition information.
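A deliberately simple stand-in for such a recognizer can be sketched as follows. This toy illustration (not the application's actual computer-vision algorithm) uses one of the recognition indicia named below, a known uniform colour, and locates the carrier by colour thresholding; the function name and tolerance are assumptions of the example:

```python
import numpy as np

def locate_carrier(frame_rgb, uniform_rgb, tol=30):
    """Toy recognizer: find the image position of a target carrier whose
    uniform colour (uniform_rgb) is a known recognition indicium.
    frame_rgb: (H, W, 3) uint8 image.
    Returns the (row, col) centroid of matching pixels, or None."""
    diff = np.abs(frame_rgb.astype(int) - np.asarray(uniform_rgb)).sum(axis=2)
    mask = diff < tol  # pixels close to the uniform colour
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())
```

A production recognizer would of course use a trained detector rather than a single colour cue, but the interface, image in, carrier position out, is the same.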
[0292] For example, the pre-defined information may be stored on a local computing unit of the laser tracker 104 and/or the laser tracker 104 may be configured for bi-directional communication with a remote computing unit having stored the pre-defined information. By way of example, such pre-defined information comprises at least one of recognition indicia for a type of the target carrier, e.g. particular shape of a machine type or a color of a worker's uniform; recognition indicia for a specific target carrier, e.g. an indicia enabling distinction between two machines or uniformed workers of the same type; an identification code of the target carrier designating a particular type; information indicative of the spatial arrangement of the cooperative target relative to the target carrier; and positional information for the target carrier, e.g. an absolute position of the target carrier provided by GPS or a radio frequency positioning system.
[0293] By way of another example, the laser tracker 104 is configured for training the recognizer for a particular target carrier 49, wherein recognition indicia indicative of the appearance of the target carrier 49 in the image data are determined by image processing. For example, training may be based on image data generated in a locked state, wherein the tracking channel is undisturbed, namely such that the tracking signal is receivable without unscheduled interruption and the aiming axis is continuously adjusted based on the control data so that it follows the cooperative target.
[0294] Therefore, the continuous lock onto the cooperative target makes the target carrier stay roughly at the same position in the camera images, whereas the background and other untracked objects move. Thus, one can make use of this knowledge about the coarse position of the target carrier in the camera images, as well as about its behavior when it is moving, to learn the visual appearance of the imaged target carrier, e.g. from all sides of the carrier in case the carrier is moving and turning.
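The locked-state training idea above can be sketched minimally as follows. This is an illustrative reduction (not the application's training method): while the beam is locked, the carrier sits near a known pixel position, so averaging patches around that position over many frames keeps the carrier's appearance while the moving background blurs out. The function name and patch size are assumptions of the example:

```python
import numpy as np

def learn_template(frames, target_px, half=16):
    """Learn the carrier's appearance during a locked state.
    frames: iterable of (H, W) grayscale arrays captured while locked;
    target_px: (row, col) pixel position of the carrier, known from the
    continuously adjusted aiming axis projected into the camera image.
    Returns the mean patch as an appearance template."""
    r, c = target_px
    patches = [f[r - half:r + half, c - half:c + half].astype(float)
               for f in frames]
    return np.mean(patches, axis=0)
```

A real system would learn a richer model (e.g. multiple views as the carrier turns), but the same locked-state supervision signal applies.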
[0295] Furthermore, the laser tracker 104 has a conventional visual target localization unit making use of a position detection sensor for localizing the cooperative target 3 within the field of view of the position detection sensor, and for deriving localization data configured to provide an alignment of the tracking beam 9, i.e. the aiming axis, of the conventional tracking unit with respect to the cooperative target 3.
[0296] In addition, the laser tracker 104 is configured to support the conventional visual target localization unit by video tracking in that the laser tracker 104 is configured to derive a motion parameter indicative of the target carrier 49 being in motion based on the image data of the second tracking unit. Thus, the visual target localization unit can be activated based on the motion parameter for providing the alignment of the aiming axis with respect to the cooperative target 3.
[0297] For example, the laser tracker 104 may automatically notice that the worker has reached the next scanning position because the worker now stands essentially still, upon which the laser tracker 104 automatically initiates a power-lock procedure to re-lock the tracking beam 9 onto the cooperative target 3. In particular, the laser tracker 104 may further be configured to take into account a position of the target carrier determined by the video tracking in combination with a known arrangement of the cooperative target 3 relative to the target carrier 49 in order to speed up the power-lock procedure by the visual target localization unit. By way of example, the laser tracker 104 is configured for video tracking of a human worker and to expect the cooperative target 3 approximately at chest height as soon as the worker stands still.
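The motion parameter and power-lock seeding described above can be sketched as follows. This is a minimal, assumption-laden illustration (function names, window length, pixel threshold and chest-height offset are all invented for the example, not values from the application):

```python
def stands_still(track, window=10, thresh_px=2.0):
    """Motion parameter: True when the video-tracked carrier position
    has moved less than thresh_px over the last `window` samples.
    track: list of (row, col) carrier positions, newest last."""
    if len(track) < window:
        return False
    recent = track[-window:]
    r0, c0 = recent[0]
    return all(abs(r - r0) <= thresh_px and abs(c - c0) <= thresh_px
               for (r, c) in recent[1:])

def powerlock_seed(carrier_px, chest_offset_px=(-40, 0)):
    """Seed for the narrow power-lock search: expect the cooperative
    target at roughly chest height, i.e. at an assumed calibrated
    pixel offset from the carrier centroid."""
    (r, c), (dr, dc) = carrier_px, chest_offset_px
    return (r + dr, c + dc)
```

The seed keeps the narrow search field of view of the visual target localization unit centred where the target is most likely to be, which is what speeds up the re-lock.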
[0298] Thus, as depicted by the figure, the worker 49 may stow the movable accessory device for repositioning and walk to the next measuring position, while the second tracking unit video-tracks the worker 49.
[0299] As soon as he reaches the next measuring position, the laser tracker 104 recognizes that the worker 49 stands still and automatically starts a power-lock procedure based on knowledge of the worker position determined by video tracking and based on knowledge of an expected position of the cooperative target 3, here at chest height of the worker 49. Therefore, despite the narrow search field of view 56 of the visual target localization unit, the laser tracker 104 is able to quickly find the cooperative target 3 in order to re-lock the tracking beam 9 onto the cooperative target 3.
[0300]
[0301] Here, the smart factory comprises one or multiple deflecting units 211, e.g. comprising a movable mirror, separate from the coordinate measuring device 210. In the example shown, the coordinate measuring device 210 and the deflecting unit 211 are arranged in a fixed positional relationship with respect to each other, wherein the relative position of the coordinate measuring device 210 and the deflecting unit 211 with respect to each other, the movement direction of the laser beam 8 with respect to the coordinate measuring device 210, and the orientation of the movable mirror of the deflecting unit 211 are known by the system. Therefore, the system is configured to generate scanning data providing coordinates of measurement points based on the known relative positions and orientations, the coordinate measuring data provided by the coordinate measuring device, and control data indicating the orientation of the deflecting unit 211.
[0302] By way of example, when using the deflecting unit 211, the laser measurement beam 8 is directed onto the deflecting unit 211 and then kept fixed, e.g. at the so-called “geometric mirror centre”, so that the directional parameters defining the overall targeting by the laser measurement beam 8 are only dependent on deflection settings of the deflecting unit 211.
[0303] For example, the mirror 211 is mounted on a two-axis motorized rotation device (not shown) such that it can make fast and precise movements based on control commands from the coordinate measuring device 210. The precise mirror orientation may be detected by high-resolution angular encoders, e.g. wherein the encoders have a defined zero-position. The deflecting unit 211 may include a CPU which can send and receive commands, e.g. wherein the deflecting unit 211 and the coordinate measuring device 210 are connected by an EtherCAT real-time interface 212.
[0304] In the embodiment depicted by the figure, coordinate measurement is based on a so-called “stable beam pointing”, wherein the laser measurement beam 8 is kept aimed at a fixed point with respect to the two rotation axes (the so-called “geometric mirror centre”). In this mode, all beam movements are carried out by the deflecting unit 211 only, here by setting different orientations of the movable mirror.
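Under stable beam pointing, the direction of the deflected beam follows directly from the encoder-read mirror orientation via the law of reflection. The sketch below illustrates this geometry (an illustration only, not the application's implementation; the function name is assumed):

```python
import numpy as np

def deflected_direction(beam_dir, mirror_normal):
    """Direction of the measurement beam after reflection at the movable
    mirror: d' = d - 2 (d . n) n, with n the unit mirror normal set by
    the two encoder-read rotation axes. The incoming beam_dir stays
    fixed at the geometric mirror centre (stable beam pointing)."""
    d = np.asarray(beam_dir, float)
    n = np.asarray(mirror_normal, float)
    n = n / np.linalg.norm(n)          # normalize the mirror normal
    return d - 2.0 * np.dot(d, n) * n
```

Since the incoming beam is fixed, the overall targeting depends only on the deflection settings, exactly as stated in paragraph [0302].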
[0305] For example, the deflecting unit 211 and the coordinate measuring device 210 are configured to communicate with each other such that the mirror 211 provides for generally known functionalities of the coordinate measuring device 210, for example an automatic target recognition functionality, e.g. wherein find-reflector commands are carried out via mirror movements; a power-lock functionality, wherein the reflector is locked onto via mirror movements; or a tracking functionality for tracking the reflector by mirror movement only, e.g. wherein movement commands based on a position sensitive detector of the coordinate measuring device are sent to the deflecting unit via a real-time interface. Furthermore, by providing a sufficient mirror diameter, the mirror 211 and the coordinate measuring device 210 may interact with each other to provide typical 6DoF-functionalities of the coordinate measuring device, e.g. wherein a laser tracker has a 6DoF-camera for detecting and analysing markings on a 6DoF measurement probe via the image of the 6DoF measurement probe seen via the mirror.
[0306] For example, the mirror 211 may further comprise a configuration of orientation markings (not shown), e.g. a set of at least three LEDs, the configuration of orientation markings being mounted and arranged such that the orientation markings are co-moving with the mirror. Therefore, the coordinate measuring device 210 is configured to determine the mirror orientation by acquiring and analysing an image which comprises at least part of the configuration of orientation markings. The determined mirror orientation can then be used to provide the control data indicating the orientation of the deflecting unit 211 in order to derive the 6DoF orientation of the measurement probe, i.e. by further analysing the image of the 6DoF measurement probe seen via the mirror.
[0307] By way of another example, the mirror 211 may be rotated to become an autocollimation mirror for detecting the coordinate of the mirror position. In particular, the mirror 211 has a set of reference markings 213, particularly at least three, arranged in a fixed relationship such that the geometric centre of the mirror can be detected, e.g. by means of image analysis of an image of the mirror captured by the coordinate measuring device 210.
[0308] In particular, the smart factory may comprise a plurality of deflecting units, wherein the system is configured such that different coordinate measuring devices may interact (though not at the same time) with the same deflecting unit, that one coordinate measuring device may interact with multiple deflecting units, or that a coordinate measuring device may only interact with a dedicated deflecting unit assigned to it.
[0309]
[0310] For example, the scanning device is a handheld scanning device as described above, wherein an image of the object to be measured 2 is provided by a conventional camera (not shown). Thus, the system may make use of fast feature detection by a segmentation algorithm known in the prior art, e.g. to detect edges, planar surfaces, textures, and/or different colors of the object 2.
[0311] By way of example, the provided setting parameter may indicate perimeter boundaries 216 for scanning the features, which may be essentially instantly visualized in an image of the object. More particularly, the system may comprise a projector configured for directly projecting the boundaries 216 onto the object. Furthermore, based on the identified features of interest 214, best scan settings for the scanning device 215, e.g. defining optimal resolution conditions for scanning the feature, may be automatically pre-set so that the operator of the scanning device 215 can essentially instantly start scanning the features.
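A perimeter boundary as described above can be represented as a polygon, and testing whether a candidate scan point lies within it is a standard point-in-polygon check. The sketch below shows the classic ray-casting variant (an illustration only; the function name is an assumption of the example):

```python
def inside_boundary(point, boundary):
    """Ray-casting test: is a candidate scan point inside the perimeter
    boundary polygon? boundary: list of (x, y) vertices in order.
    Casts a horizontal ray from the point and counts edge crossings;
    an odd count means the point is inside."""
    x, y = point
    inside = False
    n = len(boundary)
    for i in range(n):
        x1, y1 = boundary[i]
        x2, y2 = boundary[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge spans the ray's y-coordinate
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

The same test could serve either the visualization (highlighting the region in the image) or, as in paragraph [0312], filtering points when a robot arm scans the test object automatically.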
[0312] By way of another example, the scanning device 215 may be part of an automated system configured to automatically measure a test object, e.g. wherein the scanning device is mounted on a robot arm. The system may then be configured to use the provided setting parameter to automatically pre-set the scanning device and to take into account the setting parameter, e.g. providing perimeter boundaries referenceable to a local coordinate system of the robot arm, for controlling the robot arm to automatically scan the test object.
[0313] Although the invention is illustrated above, partly with reference to some preferred embodiments, it must be understood that numerous modifications and combinations of different features of the embodiments can be made. All of these modifications lie within the scope of the appended claims.