METHOD FOR REFERENCING A PLURALITY OF SENSORS AND ASSOCIATED MEASURING DEVICE
20200124406 · 2020-04-23
Inventors
CPC classification
H04N13/239
ELECTRICITY
G06T7/521
PHYSICS
G01N27/725
PHYSICS
International classification
G01B11/25
PHYSICS
G06T7/521
PHYSICS
H04N13/239
ELECTRICITY
Abstract
A method for referencing a plurality of sensor units that can be arranged around a measurement object for surveying a three-dimensional surface of the measurement object, and an associated measuring device for surveying a surface of a measurement object. A more accurate surveying of the surface of the measurement object is provided.
Claims
1. A method for referencing first and second sensors arranged around an object to be measured for surveying a three-dimensional surface of the object, wherein each sensor of the first and second sensors comprises a source of structured illumination and an optical camera at a fixed distance from the source of structured illumination, wherein in each sensor a beam of the source of structured illumination is calibrated with respect to the camera, and a transformation of the image of the structured illumination, which is recorded by the camera, from two-dimensional image points into three-dimensional camera coordinates is determined through the calibration of the sensor, wherein the method comprises: determining positions of a plurality of reference points in two-dimensional image coordinates of cameras of the first and second sensors, reconstructing the positions of the plurality of reference points in three-dimensional camera coordinates of the first sensor and of the second sensor, determining a transformation between the three-dimensional camera coordinates of the first sensor and the three-dimensional camera coordinates of the second sensor based on the reconstructed positions of the reference points, reconstructing the position of the image of the structured illumination in the three-dimensional camera coordinates of the first sensor and the second sensor based on the reconstructed reference points, determining a triangulated position of the image of the structured illumination in three-dimensional camera coordinates of the first sensor and of the second sensor, and correcting the transformation between the three-dimensional camera coordinates of the first sensor and the three-dimensional camera coordinates of the second sensor based on the triangulated positions of the image of the structured illumination.
2. A method for referencing a plurality of sensors arranged around an object to be measured for surveying a three-dimensional surface of the object, wherein each sensor of the plurality of sensors comprises a source of structured illumination and a calibrated optical camera at a fixed distance from the source of the structured illumination, wherein in each sensor a beam of the source of structured illumination is calibrated with respect to the camera, and a transformation of the image of the structured illumination, which is recorded by the camera, from two-dimensional image points into three-dimensional camera coordinates is determined through the calibration of the sensor, wherein the method comprises: determining positions of a plurality of reference points in two-dimensional image coordinates of cameras of a first sensor and a second sensor, determining, in each case, a coordinate transformation between the camera coordinate system and the coordinate system of the reference points for the first sensor and the second sensor, reconstructing the position of the image of the structured illumination in the three-dimensional camera coordinates of the first sensor and the second sensor based on the determined coordinate transformations, determining a triangulated position of the image of the structured illumination in three-dimensional camera coordinates of the first sensor and of the second sensor, and ascertaining a correction transformation between the reconstructed image and the triangulated image for each of the first and second sensors, wherein the coordinate transformations determined between the camera coordinate system and the coordinate system of the reference points of the first and the second sensors are corrected, wherein referencing from the first sensor to the second sensor is established based on the corrected transformations.
3. The method as claimed in claim 1, wherein the reference points are points of a reference object in two or more poses, wherein the reconstructed and triangulated position of the image of the structured illumination is determined for each of the two or more poses.
4. The method as claimed in claim 1, wherein the first and second sensors comprise laser line sources as the source of structured illumination and the image of the structured illumination is a laser line.
5. The method as claimed in claim 1, wherein the first and second sensors are part of a plurality of sensors.
6. The method as claimed in claim 5, wherein all of the plurality of sensors are referenced directly or indirectly to the first sensor.
7. The method as claimed in claim 5, wherein all of the plurality of sensors are arranged in a single plane.
8. The method as claimed in claim 7, wherein each of the sensors comprises one or more laser line sources as the source of structured illumination and the laser planes of the sensors are essentially coincident.
9. The method as claimed in claim 1, further comprising changing a relative position of the first and second sensors, wherein the referencing of the sensors is adjusted according to the relative position of the first and second sensors.
10. The method as claimed in claim 1, wherein the plurality of reference points are coplanar.
11. The method as claimed in claim 1, wherein the reconstructing the positions of the plurality of reference points in three-dimensional camera coordinates comprises a Perspective-n-Point method.
12. The method as claimed in claim 1, wherein the correcting the transformation between the three-dimensional camera coordinates of the first sensor and the three-dimensional camera coordinates of the second sensor based on the triangulated positions of the images of the structured illumination comprises a rigid body transformation.
13. A method comprising: surveying an object for use with a wind power installation, wherein the object is surveyed with a plurality of sensors referenced to one another using the method in accordance with claim 1, wherein the plurality of sensors are configured for capturing a surface excerpt of the object.
14. The method as claimed in claim 13, wherein the plurality of sensors are moved relative to the object, and a plurality of surface excerpts combine together to form a total surface of the object.
15. The method as claimed in claim 13, wherein the relative distances of the plurality of sensors to one another are changed at different positions of the object, wherein the referencing of the sensors is adapted to the relative distances of the plurality of sensors.
16. The method as claimed in claim 15, wherein the relative distances of the plurality of sensors are changed.
17. The method as claimed in claim 13, wherein the surface excerpts are surface profile excerpts that are combined to form a surface profile section, wherein a plurality of surface profile sections are combined into one surface profile of the object.
18. A measuring device for surveying a surface of an object of a wind power installation, wherein the measuring device comprises: a plurality of sensors arranged in a measuring plane; and a position determination unit, wherein the sensors are configured to be referenced in the measurement plane by the method as claimed in claim 1, wherein the position determination unit is configured to determine the position of the measurement plane with reference to a stationary reference.
19. The measuring device as claimed in claim 18, further comprising a movement unit for moving the measurement plane relative to the object being measured.
20. The measuring device as claimed in claim 18, wherein the measuring device is for surveying a surface of a rotor blade of the wind power installation.
Description
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
[0043] Exemplary embodiments, along with the advantages achieved through the solutions according to the invention, are described below with reference to the appended figures.
DETAILED DESCRIPTION
[0051] In this example, the movement unit 5 is an electric motor which moves the measuring device 1 along the longitudinal direction z over a rail (not illustrated) on the ground on which the frame 3 is placed, for example by means of wheels.
[0052] In this example, seven sensors 30 are provided inside the frame 3. The sensors 30 are each aimed from the frame 3 inwards in the measurement plane onto the region into which a measurement object is to be introduced. In this example, two sensors 30, namely those arranged at the upper end of the frame 3, are fastened to the frame 3 by means of an advance unit 40. The advance unit 40 makes it possible for a sensor 30 that is fastened to the frame 3 by the advance unit 40 to be moved in the measurement plane. In this example, the advance unit 40 comprises two parallel, linear advance elements 42 that are arranged at vertical partial segments of the frame 3 and which support a horizontal beam movably in the vertical direction y between the two linear advance elements 42. In other exemplary embodiments, only one, more than two, or preferably all of the sensors 30 are fastened to the frame 3 by means of an advance unit 40. Each of the sensors 30 can have a dedicated advance unit 40, or a plurality of the sensors 30 can be advanced with a common advance unit 40. The advance makes it possible for the distances of the sensors 30 from the surface of the measurement object to be adjusted in such a way that the resolution of the surveying of the surface is always sufficiently high. This is of particular relevance for measurement objects that have large differences in their cross-section.
[0053] Although the sensors 30 are arranged in a frame 3 of the measuring device 1 in this exemplary embodiment, the referencing method is not restricted by the mechanical arrangement of the sensors 30; any other arrangement in which a plurality of sensors 30 can be placed around the measurement object for surveying is equally suitable.
[0055] The punctiform light emitted by way of example from the laser light source 32 is spread by means of the cylindrical lens 34 into a line. The line emerges from the sensor 30 and falls onto a surface of the measurement object 2. The incoming laser light 36 is reflected at the surface of the measurement object 2 and enters the camera 39 via the lens 37 as the reflected line 38. The height profile of the surface can be calculated from the offset of the laser line arriving at the camera 39. Laser light section sensors are based on the known principle of laser triangulation, in which the punctiform light source is broadened into a two-dimensional line. The laser light section sensor is only one example of a sensor 30 suitable for surveying surfaces that can be employed in the measuring device 1 and in the method described herein.
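The triangulation relation behind this height calculation can be sketched as follows. The configuration (camera optical axis parallel to the laser sheet, so that the pixel offset acts like a stereo disparity) and all numbers (focal length, baseline, offset) are illustrative assumptions, not parameters of the sensors 30 described here.

```python
def range_from_offset(f_px: float, baseline_m: float, offset_px: float) -> float:
    """Distance of a laser line point from the sensor, by similar triangles.

    Assumes the simplest laser triangulation configuration: the camera
    optical axis is parallel to the laser sheet, baseline_m is the fixed
    distance between laser source and camera pinhole, f_px the focal length
    in pixels, and offset_px the pixel offset of the imaged laser line.
    All values are illustrative, not taken from the patent.
    """
    return f_px * baseline_m / offset_px

# Illustrative numbers: 800 px focal length, 100 mm baseline, 40 px offset.
z = range_from_offset(800.0, 0.100, 40.0)
print(z)  # 2.0 m; a larger pixel offset would mean a closer surface point
```

Evaluating the line offset pixel by pixel along the imaged laser line yields the height profile of the illuminated section.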
[0057] The position determination unit 50 comprises a position laser 52 and a retroreflector 54. The position laser 52 is stationary and arranged independently of the frame 3. It does not move when the frame 3 is moved by means of the movement unit 5. The position laser 52 measures the distance from the retroreflector 54, which moves with the frame 3. The retroreflector 54 reflects the radiation arriving from the position laser 52 back to the position laser 52, largely independently of the alignment of the retroreflector 54 with respect to the position laser 52. The retroreflector 54 is preferably guided continuously on a circular or elliptical track. The circular or elliptical track of the retroreflector 54 can be realized with reference to a mounting surface that is fastened to the frame 3 or with reference to the entire frame 3. Since the frame 3 moves in the longitudinal direction z while the retroreflector 54 simultaneously travels on a circular or elliptical track, a helical trajectory results, from which the position and orientation of the frame 3 of the measuring device 1 can be determined at any point in time.
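The helical trajectory can be illustrated with a small numerical sketch. The track radius, angular rate, and travel speed below are hypothetical values chosen for the example, not properties of the position determination unit 50; the point is only that the longitudinal frame position is recovered directly once the known circular offset is separated out.

```python
import math

def retroreflector_position(t: float, radius: float, omega: float, v: float):
    """Position of the retroreflector at time t, assuming (hypothetically)
    that it travels on a circular track of the given radius at angular rate
    omega in the measurement plane while the frame advances at speed v
    along the longitudinal direction z."""
    return (radius * math.cos(omega * t),
            radius * math.sin(omega * t),
            v * t)

# Hypothetical values: 0.5 m track radius, 2 rad/s, 0.1 m/s travel speed.
r, w, v = 0.5, 2.0, 0.1
p = retroreflector_position(3.0, r, w, v)

# With the circular offset known, the frame position along z is read off
# directly, and the in-plane part stays on the known circle of radius r.
frame_z = p[2]
in_plane = math.hypot(p[0], p[1])
print(round(frame_z, 9), round(in_plane, 9))  # 0.3 0.5
```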
[0059] The measuring device 1 is suitable for capturing a three-dimensional surface geometry of a measurement object 2 automatically. In particular in the case of large dimensions of the measurement object 2, and for the high measurement resolution necessary for a meaningful determination of the surface geometry, the measurement is accordingly not made from a stationary location of the measuring device 1 but from different positions, in that the frame 3 is moved by means of the movement unit 5 along the measurement object 2, so that the sensors 30 perform a movement relative to the measurement object 2 during the measurement process. A carrier unit, for example in the form of a frame 3 with a plurality of sensors 30 which are, for example, optical triangulation sensors such as laser light section sensors, is for example moved along a rail system at the measurement object 2 and tracked precisely with the aid of a position determination unit 50. The position determination unit 50 is, for example, a position laser 52 that determines the distance to a retroreflector 54 attached to the frame 3. A sequence of complete profile sections of the measurement object 2 is thus created. Single measurements of profile sections can be merged to form a three-dimensional total model with high resolution. Autonomous or preprogrammed industrial trucks can also be employed as the movement unit 5 for moving a carrier unit 3. The carrier unit 3 can also be fastened to an industrial robot in a manner permitting free manipulation, in order to be able to describe arbitrary spatial curves as the path of travel along a measurement object.
[0060] The advance unit 40, which is configured to adjust the distance of the sensors 30 from the measurement object 2, ensures that the measurement resolution of the surface of the measurement object 2 is sufficiently high regardless of the diameter of the measurement object 2 at the position at which the current profile section is being measured. Through a comparison with, for example, a CAD model, deviations of the measured three-dimensional total model can be determined.
[0061] A significant sag due to gravity, which occurs in particular in the case of long measurement objects 2 such as the rotor blades of a wind power installation, can be simulated and taken into account in the evaluation. In the case of rotor blades of a wind power installation, for example, the measurement data recorded by the measuring device 1 form the basis for a flow simulation for performance assessment or for an acoustic evaluation of the rotor blade.
[0062] With the measuring device 1 it becomes possible for the total measuring time for one rotor blade to be no longer than 30 minutes. In this time a profile section can be taken every 2 millimeters in the longitudinal direction of the measurement object 2 with the measuring device 1. With the measuring device, the local measurement error at the front and rear edges of the profile can lie in the range between 0.05 and 0.17 mm on the pressure side and between 0.07 and 0.41 mm on the suction side. A guarantee of the performance figures or the acoustic figures of the rotor blade can be maintained within these tolerance ranges.
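As a plausibility check, the profile rate implied by these figures can be computed. The 2 mm spacing and the 30-minute total time are taken from the text above; the 60 m blade length is an assumed example value, not a figure from the patent.

```python
blade_length_m = 60.0        # assumed example length of a rotor blade
profile_spacing_m = 0.002    # one profile section every 2 mm (from the text)
total_time_s = 30 * 60       # total measuring time of 30 minutes (from the text)

# Number of profile sections along the blade and the required capture rate.
n_profiles = blade_length_m / profile_spacing_m
rate_hz = n_profiles / total_time_s
print(int(n_profiles), round(rate_hz, 1))  # 30000 profiles, about 16.7 per second
```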
[0063] The description of the figures for
[0065] The source of structured illumination transmits a beam, which in the case of a laser line reduces to a plane. This beam is likewise calibrated with reference to the camera, which permits a triangulation, from the ray path of the structured illumination, of the object points at which the incoming light is reflected and then received in the optical camera. Once the sensor has been calibrated, each two-dimensional image point of the light that was transmitted from the source of structured illumination and reflected at the measurement object can be assigned to precisely one point in three-dimensional coordinates that corresponds to the surface of the measurement object. Expressed otherwise, the calibration of the sensor unit enables a precise assignment between the image of the structured illumination that the camera records in two-dimensional image points and the reflections at the object surface in three-dimensional camera coordinates that underlie this image.
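This one-to-one assignment works because the calibrated laser plane closes the system of equations: the pinhole ray through the image point is intersected with the plane. The following is a minimal sketch under assumed pinhole intrinsics and plane parameters; `pixel_to_3d` is an illustrative name, not a function of the described system.

```python
def pixel_to_3d(u, v, f, cx, cy, n, c):
    """Intersect the camera ray through pixel (u, v) with the calibrated
    laser plane {X : n.X = c}, all expressed in camera coordinates.

    f is the focal length in pixels, (cx, cy) the principal point, n the
    plane normal and c its offset. All values here are illustrative
    stand-ins for a real sensor calibration.
    """
    # Ray direction through the pixel for an ideal pinhole camera.
    d = ((u - cx) / f, (v - cy) / f, 1.0)
    denom = n[0] * d[0] + n[1] * d[1] + n[2] * d[2]
    if abs(denom) < 1e-12:
        raise ValueError("ray is parallel to the laser plane")
    t = c / denom
    return (t * d[0], t * d[1], t * d[2])

# Laser plane z = 2 (normal (0, 0, 1), offset 2); pixel at the principal point.
X = pixel_to_3d(320.0, 240.0, 800.0, 320.0, 240.0, (0.0, 0.0, 1.0), 2.0)
print(X)  # (0.0, 0.0, 2.0)
```

Every pixel on the imaged laser line thus maps to exactly one surface point, which is the triangulation used later in step 150.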
[0066] A problem underlying the invention then occurs when a plurality of sensors that are to be arranged around the measurement object for surveying a three-dimensional surface must be referenced to one another. In particular in the case of complex geometries of the measurement objects, highly curved objects or objects with undercuts or inclusions for example, the use of a plurality of sensors is frequently the only solution for achieving a measurement of the surface of the measurement object within an acceptable time.
[0067] The method 100 for referencing a plurality of sensors initially comprises, in step 110, a determination of the positions of a plurality of reference points in two-dimensional image coordinates by the cameras of a first and a second sensor 30. The plurality of reference points can, for example, be arranged on a simple two-dimensional reference object such as a two-dimensional checkerboard pattern. No further requirement is placed on the reference points; in other examples, reference points that are not coplanar (for example, not in a two-dimensional checkerboard pattern) are also conceivable. The positions of the plurality of reference points are determined with the reference points, for example the reference object, in the same pose by at least the first and second sensors that are to be referenced to one another. In the simplest case, the exemplary checkerboard pattern is accordingly recorded in the same position by the first and second cameras. The recognition of the reference points in the two-dimensional image coordinates takes place through evaluation of the recorded images.
[0068] In step 120 the positions of the plurality of reference points in three-dimensional camera coordinates of the first sensor and of the second sensor are reconstructed. Through the reconstruction of the positions of the plurality of reference points, the position of a plane of the reference object in which the reference points lie can, for example, be determined. Expressed otherwise, a coordinate transformation between the camera coordinate system and the coordinate system of the reference points is determined for each of the sensors.
[0069] The principle of the camera calibration is shown schematically with reference to
[0070] For step 120, then, after the camera calibration has been completed successfully, only the reference points in a single pose are necessary. The known camera calibration makes it possible for the plurality of reference points to be reconstructed from the two-dimensional image data in three-dimensional camera coordinates. Positions of the same reference points in three-dimensional camera coordinates are thus present for a plurality of sensors.
[0071] In step 130, on the basis of the reconstructed positions, a transformation is then determined between the three-dimensional camera coordinates of the first sensor and the three-dimensional camera coordinates of the second sensor.
[0072] The referencing of a plurality of sensors is described schematically with reference to
[0073] The transformation matrix T1, as stated, is composed of the rotation R and the translation vector t. The extrinsic transformation T accordingly enables a transformation of global coordinates into camera coordinates. The Perspective-n-Point algorithm or, preferably, a non-iterative Perspective-n-Point approach known as the efficient Perspective-n-Point algorithm (ePnP) is used to determine this transformation. An extrinsic transformation, for example T1 for camera 39a and T2 for camera 39b, can accordingly be obtained for a plurality of sensors with the same pose, for example 601, of the exemplary two-dimensional checkerboard pattern. The registration between the two sensors can then be obtained in the form T12 = T2 T1^-1. It has, however, been found that sufficient accuracy is not achieved with this kind of sensor referencing alone. The referencing must accordingly be improved further.
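The registration T12 = T2 T1^-1 can be sketched with plain 4x4 homogeneous matrices. The sketch below uses arbitrary illustrative extrinsics and hypothetical helper names; it only demonstrates the composition, not the ePnP pose estimation itself.

```python
import math

def matmul(A, B):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rigid_inverse(T):
    """Invert a rigid transform [R | t]: the inverse is [R^T | -R^T t]."""
    R = [[T[j][i] for j in range(3)] for i in range(3)]            # R transposed
    t = [-sum(R[i][j] * T[j][3] for j in range(3)) for i in range(3)]
    return [R[i] + [t[i]] for i in range(3)] + [[0.0, 0.0, 0.0, 1.0]]

def apply(T, p):
    """Apply a 4x4 rigid transform to a 3D point."""
    q = p + (1.0,)
    return tuple(sum(T[i][j] * q[j] for j in range(4)) for i in range(3))

def rot_z(a, tx, ty, tz):
    """Rotation about z by angle a, plus a translation (illustrative pose)."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0.0, tx], [s, c, 0.0, ty],
            [0.0, 0.0, 1.0, tz], [0.0, 0.0, 0.0, 1.0]]

# Illustrative extrinsics of two cameras with respect to the reference object.
T1 = rot_z(0.4, 0.1, -0.2, 1.5)
T2 = rot_z(-0.9, 0.3, 0.0, 1.2)
T12 = matmul(T2, rigid_inverse(T1))   # registration from camera 1 to camera 2

# A reference point must map consistently: T12 (T1 X) equals T2 X.
X = (0.7, -0.3, 2.0)
lhs, rhs = apply(T12, apply(T1, X)), apply(T2, X)
print(all(abs(a - b) < 1e-9 for a, b in zip(lhs, rhs)))  # True
```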
[0074] In a step 140 the position of the image of the structured illumination in the three-dimensional camera coordinates of the first sensor and of the second sensor is reconstructed for this purpose. The structured illumination of the sources 31a, 31b and/or 31c is recorded for this purpose in temporal sequence by the respective cameras 39a, 39b and 39c. The position of the image of the structured illumination reconstructed in step 140 is subject to the same estimation errors as the reconstruction of the reference points determined in step 120.
[0075] In step 150, in addition to the position reconstructed in step 140, a triangulated position of the image based on the calibrated beam of the structured illumination is determined. In contrast to the reconstructed position, the triangulated position is not subject to the error of pose estimation.
[0076] Finally, in step 160, the transformation between the three-dimensional camera coordinates of the first sensor and the three-dimensional camera coordinates of the second sensor is corrected on the basis of the triangulated position determined in step 150.
[0077] The correction of the referencing transformation is described schematically and in an exemplary manner with reference to
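Since the laser planes of the sensors are essentially coincident, the correction between reconstructed and triangulated line points can be illustrated as a rigid-body fit within that common plane. The sketch below uses the closed-form two-dimensional least-squares alignment with made-up sample points; it stands in for the general rigid body transformation mentioned in the text and is not taken from the patent itself.

```python
import math

def fit_rigid_2d(src, dst):
    """Least-squares rigid transform (rotation theta, translation t) that
    maps the 2D points src onto dst: the closed-form planar fit obtained
    from centered cross and dot products."""
    n = len(src)
    csx = sum(p[0] for p in src) / n; csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n; cdy = sum(p[1] for p in dst) / n
    num = den = 0.0
    for (sx, sy), (dx, dy) in zip(src, dst):
        ax, ay = sx - csx, sy - csy        # centered source point
        bx, by = dx - cdx, dy - cdy        # centered destination point
        num += ax * by - ay * bx           # sum of cross products
        den += ax * bx + ay * by           # sum of dot products
    theta = math.atan2(num, den)
    c, s = math.cos(theta), math.sin(theta)
    tx = cdx - (c * csx - s * csy)
    ty = cdy - (s * csx + c * csy)
    return theta, (tx, ty)

# Made-up "reconstructed" line points, and the same points after a known
# rotation and translation playing the role of the triangulated positions.
recon = [(0.0, 0.0), (1.0, 0.1), (2.0, 0.0), (3.0, -0.1)]
a, (tx0, ty0) = 0.05, (0.4, -0.2)
ca, sa = math.cos(a), math.sin(a)
tri = [(ca * x - sa * y + tx0, sa * x + ca * y + ty0) for x, y in recon]

theta, t = fit_rigid_2d(recon, tri)
print(round(theta, 6), round(t[0], 6), round(t[1], 6))  # 0.05 0.4 -0.2
```

The recovered transform is the correction transformation that is applied to the PnP-based registration; with noisy real data it is a least-squares estimate rather than an exact recovery.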
[0078] Misalignments, as well as deliberate repositionings and tiltings of sensors, are recognized through the sensor registration and entered appropriately into the total model. The proposed method thus combines an estimation of the pose, performed for example with the ePnP algorithm, with an approach for determining a rigid body transformation based on trustworthy data, namely the triangulated data of each sensor. The flexibility of the method rests on its applicability to other algorithms for estimating the pose, such as for example epipolar geometry, as well as on the reference objects used, which can be two-dimensional or three-dimensional, only have to define a plurality of reference points, and can thus also be simple geometric shapes. The poses can, for example, also be estimated through the direct linear transformation (DLT), or with the reference objects previously used for the camera calibration. It is only necessary to ensure that enough information from the reference object is present in the overlapping region of the two adjacent camera systems, and that the beams of the structured illumination intersect the reference object.
[0079] Although the illustrated exemplary embodiments show a rotor blade 2 of a wind power installation as an example of a measurement object, the effects and advantages achieved through the invention are also applicable to other measurement objects, in particular long measurement objects of variable cross-section.