COORDINATE MEASURING SYSTEM

20170292827 · 2017-10-12

    Abstract

    A system for measuring spatial coordinates of a measurement object, comprising a mobile computer device comprising a first optical sensor for capturing image data of the measurement object; a pose determination unit comprising an external tracking sensor, wherein the external tracking sensor is embodied separately from the mobile computer device and is configured to capture pose data indicative of a pose of the mobile computer device; and a control unit configured to determine the spatial coordinates of the measurement object on the basis of the image data of the measurement object and the pose data of the mobile computer device.

    Claims

    1. A system for measuring spatial coordinates of a measurement object, comprising: a mobile computer device comprising a first optical sensor for capturing image data of the measurement object; a pose determination unit comprising an external tracking sensor, wherein the external tracking sensor is embodied separately from the mobile computer device and configured to capture pose data indicative of a pose of the mobile computer device; and a control unit configured to determine the spatial coordinates of the measurement object based on the image data of the measurement object and the pose data of the mobile computer device.

    2. The system as claimed in claim 1, wherein the control unit is integrated into the mobile computer device.

    3. The system as claimed in claim 1, further comprising an external computer device, on which at least part of the control unit is implemented, wherein the external computer device is connected via a data connection to the pose determination unit and the mobile computer device.

    4. The system as claimed in claim 1, wherein the control unit is configured to assume the measurement object to be time-invariant when evaluating the image data of the measurement object to determine the spatial coordinates.

    5. The system as claimed in claim 1, wherein the external tracking sensor comprises a second optical sensor, and wherein the pose data of the mobile computer device comprise image data of a monitoring region including the mobile computer device.

    6. The system as claimed in claim 5, wherein the second optical sensor comprises two stationary cameras.

    7. The system as claimed in claim 1, wherein the pose determination unit furthermore comprises an internal position and/or location capture sensor, which is integrated into the mobile computer device and is configured to capture data with regard to position and/or location of the mobile computer device, and wherein the control unit is configured to determine the spatial coordinates of the measurement object also on the basis of the data captured by the internal position and/or location capture sensor.

    8. The system as claimed in claim 1, wherein the mobile computer device furthermore comprises a third optical sensor for capturing image data of the environment of the mobile computer device, wherein the control unit is configured to identify at least one stationary reference point in the image data of the environment of the mobile computer device and to determine a position and location of said reference point relative to the mobile computer device, and wherein the control unit is configured to determine the spatial coordinates of the measurement object also on the basis of the determined position and location of the at least one identified reference point relative to the mobile computer device.

    9. The system as claimed in claim 8, wherein the control unit is configured to determine whether the external tracking sensor is imaged in the image data captured by the third optical sensor.

    10. The system as claimed in claim 5, wherein the mobile computer device comprises a display and an optical marker, and wherein the control unit is configured to determine the pose of the mobile computer device within the image data of the monitoring region by means of the optical marker.

    11. The system as claimed in claim 10, wherein the optical marker is arranged fixedly on the mobile computer device.

    12. The system as claimed in claim 10, wherein the control unit is configured to generate the optical marker on the display.

    13. The system as claimed in claim 12, wherein the control unit is configured to vary a representation and/or position of the optical marker on the display over time.

    14. The system as claimed in claim 13, wherein the control unit is configured to vary the representation and/or position of the optical marker on the display depending on the pose data of the mobile computer device.

    15. The system as claimed in claim 13, wherein the control unit is configured to synchronize the image data of the measurement object captured by the first optical sensor with the image data of the monitoring region captured by the second optical sensor, on the basis of the temporally varied representation and/or position of the optical marker.

    16. The system as claimed in claim 1, wherein the first optical sensor comprises a telecentric optical unit or a plenoptic optical unit.

    17. A method for measuring spatial coordinates of a measurement object, comprising the following steps: providing a mobile computer device comprising a first optical sensor; capturing image data of the measurement object by means of the first optical sensor; capturing pose data indicative of a pose of the mobile computer device by means of a pose determination unit comprising an external tracking sensor, wherein the external tracking sensor is embodied separately from the mobile computer device; and determining the spatial coordinates of the measurement object on the basis of the image data of the measurement object and the pose data of the mobile computer device.

    Description

    BRIEF DESCRIPTION OF THE DRAWINGS

    [0057] Exemplary embodiments of the invention are shown in the drawings and are explained in greater detail in the following description. In the figures:

    [0058] FIG. 1 shows a schematic illustration of a first exemplary embodiment of the system according to the disclosure;

    [0059] FIG. 2 shows a first exemplary embodiment of a mobile computer device which can be used in the system according to the disclosure;

    [0060] FIG. 3 shows a second exemplary embodiment of the mobile computer device;

    [0061] FIG. 4 shows a schematic illustration of the system according to the disclosure in a further exemplary application;

    [0062] FIG. 5 shows a block diagram for schematically illustrating the components of the system according to the disclosure in accordance with one exemplary embodiment;

    [0063] FIG. 6 shows a block diagram for schematically illustrating the components of the system according to the disclosure in accordance with a further exemplary embodiment; and

    [0064] FIG. 7 shows a block diagram for schematically illustrating the components of the system according to the disclosure in accordance with a further exemplary embodiment.

    DESCRIPTION OF PREFERRED EMBODIMENTS

    [0065] FIG. 1 shows a system in accordance with one exemplary embodiment of the present disclosure. The system is designated therein in its entirety by the reference numeral 10.

    [0066] The system 10 is illustrated schematically in FIG. 1 on the basis of the example of one possible case of application. A user 12 of the system 10, e.g. a manufacturing employee, measures therein a measurement object 14 with the aid of the system 10. The measurement object 14 is, for example, a workpiece (here with two schematically illustrated drilled holes) which is situated on a conveyor belt 16 in front of the user 12. The measurement task to be performed by the user involves for example determining the diameter of the drilled hole 18 present in the measurement object 14. In addition, part of the measurement task may be to measure the flatness of the surface in which the drilled hole 18 is situated.

    [0067] Instead of a large, structurally complex, relatively immobile measurement set-up of a coordinate measuring machine usually used for such tasks, the system 10 according to the disclosure for measuring the spatial coordinates of the measurement object 14 is comparatively small and capable of mobile use. The system 10 comprises a mobile computer device 20. Said mobile computer device 20 is preferably a tablet computer. By way of example, it is possible to use an iPad Air 2 WiFi plus cellular™, since this device combines a large number of the functions which are required for the system 10 according to the disclosure and are explained below. In principle, however, the use of a smartphone or laptop is also conceivable.

    [0068] A first exemplary embodiment of such a mobile computer device 20 is illustrated schematically in FIG. 2. It comprises a first optical sensor 22, which can be used to capture image data of the measurement object 14. The first optical sensor 22, which is preferably embodied as a camera, is preferably suitable both for capturing individual images (photographs) and for capturing entire image sequences (videos). “Image data” are thus understood to be either individual images or entire image sequences.

    [0069] In the present exemplary embodiment, the mobile computer device 20 furthermore also comprises a display 24 and a further optical sensor 26, which is designated as third optical sensor 26 in the present case. The third optical sensor 26 is preferably arranged on the same side of the mobile computer device 20 as the display 24. By contrast, the first optical sensor 22 is preferably arranged on the opposite side, such that the optical sensors 22, 26 have opposite viewing directions, as is illustrated schematically with the aid of the arrows 28 (see FIG. 2).

    [0070] A further component of the system 10 according to the disclosure is a pose determination unit 30 comprising a tracking sensor 32 for capturing data with regard to the position and location of the mobile computer device 20. Said tracking sensor 32 is embodied as an external tracking sensor, that is to say that it is not integrated into the mobile computer device 20, but rather tracks the position and location thereof externally. The external tracking sensor 32 preferably comprises one or more cameras. In the exemplary embodiment illustrated schematically in FIG. 1, the external tracking sensor 32 comprises two cameras 34, 36, which are installed as stationary cameras in the space and are directed at the processing station or the mobile computer device 20. Said cameras 34, 36 thus capture image data of a monitoring region in which the mobile computer device 20 is also situated. From said image data it is possible to ascertain, as explained in greater detail further below, the position and location of the mobile computer device 20 and thus also the position and location of the first optical sensor 22. It goes without saying that, instead of two cameras 34, 36, in principle three or more cameras may also be part of the external tracking sensor 32. Equally, it is also conceivable to use just a single camera, for example a 3D camera.
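
The triangulation by which such a two-camera tracking sensor recovers a 3D position can be sketched as follows. This is a minimal illustration of linear (DLT) triangulation using NumPy; the projection matrices, pixel coordinates, and all numbers are illustrative assumptions and not taken from the disclosure.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one point seen by two calibrated cameras.
    P1, P2: 3x4 projection matrices; uv1, uv2: pixel coordinates (u, v)."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)   # null vector of A is the homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]           # homogeneous -> Euclidean coordinates

# Two toy cameras: one at the origin, one shifted one unit along x.
K = np.diag([800.0, 800.0, 1.0])                                   # intrinsics
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Project a known point into both cameras, then recover it.
X_true = np.array([0.2, -0.1, 5.0, 1.0])
x1, x2 = P1 @ X_true, P2 @ X_true
uv1, uv2 = x1[:2] / x1[2], x2[:2] / x2[2]
print(triangulate(P1, P2, uv1, uv2))  # ≈ [0.2, -0.1, 5.0]
```

In a real set-up the marker's pixel coordinates would come from the images of the cameras 34, 36, and their projection matrices from a one-time calibration of the stationary installation.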

    [0071] FIG. 5 shows a block diagram which schematically illustrates the fundamental components of the system according to the disclosure and their interconnection. In addition to the components already mentioned, the system 10 according to the disclosure comprises a control unit 38. In the exemplary embodiment illustrated schematically in FIG. 5, the control unit is embodied as an internal component of the mobile computer device 20. The control unit 38 preferably comprises a processor or computer chip on which corresponding image evaluation and control software is installed. Moreover, the control unit 38 preferably contains nonvolatile memories, or technology for wired or wireless access to such memories, in order to store and retrieve data relevant to the measurements, such as results, test plans, parameter definitions, etc., in a machine-readable manner. Besides data storage, outsourced processing of calculations may also be effected where appropriate (cloud computing). The control unit 38 is configured to determine the spatial coordinates of the measurement object 14 on the basis of the image data of the measurement object ascertained by the first optical sensor 22 and on the basis of the data ascertained by the external tracking sensor 32 (the cameras 34, 36) with regard to the position and location of the mobile computer device 20. The following boundary conditions are important for this type of determination of the spatial coordinates of the measurement object 14: First, the first optical sensor 22 must be a calibrated sensor, calibrated at least to the extent that its aperture angle is unambiguously known, so that pixel distances ascertained in the image data can later be converted into real distances. Second, it is assumed that the position and location of the mobile computer device 20 can be ascertained at every point in time with the aid of the image data captured by the cameras 34, 36, for example with the aid of known triangulation methods. A further advantageous boundary condition is the assumption that the measurement object 14 is temporally invariant, i.e. a rigid and motionless body.
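
The first boundary condition, converting pixel distances into real distances by way of a known aperture angle, can be sketched as follows. All numerical values (aperture angle, working distance, sensor width, hole size) are illustrative assumptions and not taken from the disclosure.

```python
import math

def mm_per_pixel(aperture_angle_deg, working_distance_mm, image_width_px):
    """Approximate object-space size of one pixel for a pinhole-like camera.
    Valid near the optical axis; aperture_angle_deg is the full horizontal
    aperture angle of the calibrated sensor."""
    field_width = 2.0 * working_distance_mm * math.tan(
        math.radians(aperture_angle_deg) / 2.0)
    return field_width / image_width_px

# Illustrative numbers: 60 degree aperture angle, 300 mm working distance
# (known from the external tracking), 4000 px wide image sensor.
scale = mm_per_pixel(60.0, 300.0, 4000)

# A drilled hole spanning 230 px in the image then measures:
hole_diameter_px = 230.0
print(f"{hole_diameter_px * scale:.2f} mm")  # prints "19.92 mm"
```

The working distance itself is what the external tracking sensor supplies; without it, the pixel-to-millimeter conversion would be undetermined.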

    [0072] The spatial position and location (pose) of the image recording system 20 are thus known unambiguously at every point in time, since they are supplied by the external tracking sensor 32. The additional assumption of the time invariance of the measurement object 14 then allows correction of the imaging differences (such as e.g. a different working distance and hence magnification or reduction) and of the imaging aberrations (such as e.g. distortion) in the individual images of an image sequence supplied by the first optical sensor 22, and creation of a continuous and accurate 3D reconstruction of the measurement object 14 from the entire corrected image sequence. For this purpose, by way of example, at least two images of the measurement object 14 are recorded with the aid of the first optical sensor 22, wherein these two images are recorded in different positions and/or locations of the mobile computer device 20 and thus also of the first optical sensor 22. The position and location of the mobile computer device 20 (and thus also of the first optical sensor 22) at the point in time of capturing each of the two images can be ascertained exactly on the basis of the image data obtained by the cameras 34, 36. Size and position changes of the imaging of the measurement object 14 from one of the two images to the other can then be linked with the ascertained position and location change of the first optical sensor 22, such that the real dimensions within the two images captured by the first optical sensor 22 can be determined unambiguously. The control unit 38 is preferably configured to calculate a 3D point cloud of the measurement object 14 with the aid of the method mentioned above, wherein the coordinates of these points can be represented in an unambiguously defined, known coordinate system. Measurements with an accuracy in the range of one or a few micrometers are possible in this way.
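
How a known pose change resolves the absolute scale of the measurement object can be illustrated with a simple pinhole-camera sketch; the function name and all numbers are illustrative assumptions, not taken from the disclosure.

```python
def real_size_from_two_views(s1_px, s2_px, delta_d, f_px):
    """Recover the real size S of a feature from its apparent size in two
    images, the second taken delta_d further away (delta_d comes from the
    external tracker's pose data). Pinhole model: s_px = f_px * S / d, so
        s1 * d1 = s2 * (d1 + delta_d)  =>  d1 = s2 * delta_d / (s1 - s2),
    and S = s1 * d1 / f_px."""
    d1 = s2_px * delta_d / (s1_px - s2_px)
    return s1_px * d1 / f_px

# Illustrative numbers: focal length 800 px, the tablet is moved 100 mm
# away between the two shots, and the feature shrinks from 40 px to 32 px.
print(real_size_from_two_views(40.0, 32.0, 100.0, 800.0))  # prints 20.0 (mm)
```

Without the externally measured pose change delta_d, only the product of size and distance would be observable; the tracking data is what makes the dimensions absolute.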

    [0073] Instead of an embodiment of the control unit 38 as a component completely integrated into the mobile computer device 20, an embodiment is likewise conceivable in which at least parts of the control unit 38 and/or of the data storage are implemented in an external computer or in a cloud. FIG. 6 shows such an embodiment in a schematic block diagram. In this embodiment, the control unit 38 is not integrated solely into the mobile computer device 20; part of it is transferred to an external server 40. Said external server 40 can be connected via a data connection 42 to the internal control unit 44 of the mobile computer device 20. The coupling can be effected for example via a data interface 46 of the mobile computer device 20. The data connection 42 may be either a wired or a wireless connection. The external server 40 may be either a real server or a virtual server (cloud) that can be accessed via the internet or some other network.

    [0074] The exemplary embodiment illustrated in FIG. 6 has the advantage, in comparison with the exemplary embodiment illustrated in FIG. 5, that a large part of the computational effort occurs outside the mobile computer device 20. This not only saves battery power, but also prevents excessive heating of the mobile computer device 20. Avoiding such excessive heating is particularly important here, since relatively large temperature changes can lead to measurement errors of the first optical sensor 22.

    [0075] The above-explained measurement principle of the system 10 according to the disclosure can be optimized with regard to its precision with the aid of a multiplicity of further system features. In order to simplify the optical capture of the position and location of the mobile computer device 20, for example, a plurality of optical markers 48 can be represented on the display 24. In accordance with one exemplary embodiment of the present disclosure, the shape and/or position of said optical markers 48 on the display 24 can be changed in a predefined manner over time. This enables, for example, automated temporal synchronization of the external tracking sensor 32 (cameras 34, 36) with the mobile computer device 20 and the first optical sensor 22 incorporated therein. Likewise, however, it is also possible to change the optical markers 48 represented on the display 24 depending on the position and location of the mobile computer device 20. For this purpose, the mobile computer device 20 preferably comprises an internal pose determination sensor 50 (see FIG. 7), which together with the external tracking sensor 32 can be used as the pose determination unit 30. The data supplied by this internal position and location sensor 50 can be used not only for making the measurement more precise but also, in the example mentioned above, for changing the position and/or shape of the optical markers 48. This has the advantage that the optical markers 48 can be adapted such that they can be optimally identified by the cameras 34, 36 at any point in time.
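
One conceivable encoding scheme for such a temporally varied marker, used here for frame synchronization, is sketched below. The bit-pattern representation and the function names are illustrative assumptions, not taken from the disclosure.

```python
def encode_marker(counter, n_bits=8):
    """On/off states of n_bits display cells encoding a frame counter,
    shown on the display for one frame."""
    return [(counter >> i) & 1 for i in range(n_bits)]

def decode_marker(cells):
    """Recover the frame counter from the cell states as seen by the
    external tracking cameras."""
    return sum(bit << i for i, bit in enumerate(cells))

def synchronize(tablet_frames, tracker_frames):
    """Pair tablet frames with tracker frames carrying the same decoded
    counter. In this sketch, frames are (counter, payload) pairs."""
    tracker_by_counter = dict(tracker_frames)
    return [(payload, tracker_by_counter[c])
            for c, payload in tablet_frames if c in tracker_by_counter]

# Frames whose counter appears in both streams are matched; the rest drop out.
pairs = synchronize([(1, "img1"), (2, "img2")], [(2, "pose2"), (3, "pose3")])
```

The matched pairs give the control unit image data and pose data belonging to the same instant, without requiring the two devices to share a clock.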

    [0076] FIG. 7 shows a block diagram which schematically illustrates an exemplary embodiment of the type mentioned last wherein the pose determination unit 30 comprises not only the external tracking sensor 32 but also an internal pose determination sensor 50. A multiplicity of possible sensors are appropriate as internal pose determination sensors 50, e.g. a gyrometer, an acceleration sensor, a GPS sensor, a barometer, etc. It goes without saying that, according to the disclosure, the mobile computer device 20 can also comprise a plurality of these pose determination sensors 50.
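
A common way to fuse a fast internal sensor, such as a gyrometer, with the slower but drift-free external tracking is a complementary filter. The following one-axis sketch is an illustrative assumption and not the fusion method prescribed by the disclosure.

```python
def fuse_orientation(angle, gyro_rate, ext_angle, dt, alpha=0.98):
    """One step of a complementary filter: integrating the internal
    gyrometer rate gives a fast, smooth estimate; blending in the external
    tracking angle removes the gyro's slow drift."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * ext_angle

# With a stationary device (gyro rate 0), the estimate converges to the
# drift-free external measurement of 1.0 rad.
angle = 0.0
for _ in range(500):
    angle = fuse_orientation(angle, gyro_rate=0.0, ext_angle=1.0, dt=0.01)
```

In practice a full pose filter (e.g. a Kalman filter over position and orientation) would replace this scalar version, but the division of labor is the same: internal sensors for bandwidth, external tracking for absolute accuracy.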

    [0077] In comparison with the embodiments illustrated schematically in FIGS. 5 and 6, the mobile computer device 20 in accordance with the exemplary embodiment illustrated in FIG. 7 also comprises the third optical sensor 26, already mentioned further above, in addition to the first optical sensor 22. Said third optical sensor can be used essentially for the following functions in the system 10 according to the disclosure: The environment of the mobile computer device 20 can be observed with the aid of the third optical sensor 26. By way of example, it is thereby possible to identify one or a plurality of stationary reference points 52 (see FIG. 1) on the basis of which the position and location determination of the mobile computer device can be made even more precise by means of evaluation of the image data obtained with the aid of the third optical sensor 26.

    [0078] A further possible application of the third optical sensor 26 in the system 10 according to the disclosure is as follows: The image data captured by the third sensor 26 can also be evaluated as to whether the cameras 34, 36 are visible in said image data. This evaluation is based on the consideration that a lack of visibility of the cameras 34, 36 in the image data captured by the third optical sensor 26 is a strong indication that the cameras 34, 36 in turn do not have an unrestricted view of the mobile computer device 20. If such a case is detected, the control unit 38 can discard the corresponding image data of one or both cameras 34, 36. This saves data capacity and increases the robustness of the position and location determination.

    [0079] Further exemplary embodiments of the mobile computer device 20 and of the system 10 according to the disclosure are illustrated schematically in FIGS. 3 and 4. The first optical sensor 22′ and the third optical sensor 26′ are provided therein for example with additional optical units 54, 56, which can preferably be arranged in a releasable manner on the mobile computer device 20. Said optical units 54, 56 may be so-called clip-on optical units, for example, which can be pushed or clipped onto the optical sensors 22′, 26′ of the mobile computer device 20. This is conceivable particularly in cases in which the reproducibilities of the imaging conditions of the optical units 22, 26 of the mobile computer device 20 are not sufficient to be able to achieve the desired accuracies. For these cases, the clip-on optical units can be designed such that the optical unit in the mobile computer device 20 need no longer be adjusted. That is to say that possible desired changes, e.g. with regard to the working distance and/or the magnification, would be transferred to the clip-on optical unit. The control of this adjustable clip-on optical unit is preferably effected via the mobile computer device 20 or the control unit 38 or 44 thereof. Necessary algorithms, for example for assessment of contrast, sharpness and illumination, may likewise already be contained in said clip-on optical units e.g. in a machine-readable manner or ID chips may be installed, such that algorithms, calibration parameters, etc. that are relevant to the operation of the respective optical unit can be retrieved by a server or from a memory.

    [0080] In particular, telecentric clip-on optical units 54, 56 are advantageous for metrological applications or dimensional measurement. Telecentric optical units fundamentally differ from the customary optical units installed in mobile computer devices 20: they image objects with a distance-independent scale. This is ideal for measurement, since uncertainties in the positioning of the imaging system do not translate into uncertainties of the imaging scale, which would directly limit the achievable measurement accuracy.
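
The benefit of a distance-independent scale can be quantified with a short sketch; the pinhole error model and the numbers are illustrative assumptions, not taken from the disclosure.

```python
def size_error_entocentric(true_d, assumed_d):
    """Relative measurement error when an ordinary entocentric (pinhole-like)
    lens is used and the working distance is mis-estimated: the imaging
    scale is proportional to 1/d, so the inferred size is off by the factor
    assumed_d / true_d."""
    return assumed_d / true_d - 1.0

# A 1 mm positioning error at 100 mm working distance gives roughly a 1 %
# size error with an entocentric lens; with a telecentric lens the scale is
# distance-independent, so the same positioning error contributes nothing.
err = size_error_entocentric(true_d=101.0, assumed_d=100.0)
print(f"{abs(err):.2%}")  # prints "0.99%"
```

For micrometer-level targets, this is why removing the distance dependence of the scale matters more than simply improving the pose tracking.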

    [0081] Furthermore, the clip-on optical units 54, 56 can also be configured such that a so-called plenoptic or light field recording becomes possible with the sensors 22′, 26′ of the mobile computer device 20. Particularly chips having an extremely high number of pixels (greater than 40 MPx), such as are becoming increasingly widespread in many mobile computer devices 20, offer a good basis for this. The “transformation” of the normal terminal cameras 22, 26 into a plenoptic camera has the advantage that the computer device 20 is directly able to generate 3D information about the measurement object 14 from individual recordings. Accordingly, with the use of a plenoptic camera in combination with the position and location ascertainment effected by the external tracking sensor 32, there is an increase in the stability and/or accuracy of the measurement with the aid of the system 10 according to the disclosure, by means of which the 3D contour of the imaged measurement object 14 is reconstructed from the image data.

    [0082] Furthermore, as indicated schematically in FIG. 4, it may be advantageous if the system according to the disclosure furthermore comprises one or a plurality of illumination devices 58. By way of example, a clip-on optical unit 54 can be placed onto the first optical sensor 22′, a stereoscopy module realized via color-selective optics (red/blue) being integrated into said clip-on optical unit. In this exemplary application, the illumination device 58 may be configured to illuminate the measurement object 14 in a spatially and/or temporally modulated manner, e.g. in a red and blue striped manner.

    [0083] Instead of or in addition to the clip-on optical units 54, 56 pushed onto the sensors 22′, 26′, a corresponding clip-on optical unit 60 can also be placed onto the display 24 (see FIG. 3). Such a clip-on optical unit may be designed for example in a holographic fashion, in a refractive fashion, in a diffractive fashion, in a specularly reflective fashion, in a color-sensitive fashion, in a polarization-sensitive fashion or as a combination thereof. A clip-on optical unit 60 would likewise be conceivable which alternatively or supplementarily comprises a combination of Fresnel optics and micro-optics by which the light emerging from the display 24 is firstly focused cell by cell and then directed under the display-side camera lens 26′, 56. Instead of a large working distance and a large field of view, the third optical sensor 26′ in this case can then be adapted to a smaller working distance and a smaller field of view, but a larger resolution. By progressively switching on the individual illumination cells in the display 24 and recording the images that respectively arise in this case, resolution-enhancing methods, so-called angular illumination methods, become accessible. By way of example, a so-called ptychographic sensor can thus be realized. All directions of incidence could be realized by rotating the mobile computer device 20 about an axis parallel to the viewing direction 28 of the camera 26′. In this case, accurate movement of the mobile computer device 20 is not necessary since the position and location thereof are externally captured simultaneously by the external tracking sensor 32. Using this or other so-called angular illumination methods, it is possible to overcome resolution limitations of the generally low-aperture optical units of such mobile computer devices 20.

    [0084] With the aid of such clip-on optical units 60 it would also be possible to use the display 24 as illumination for the measurement object 14. This is illustrated schematically in the situation depicted in FIG. 4. In comparison with the situation illustrated in FIG. 1, the user 12 holds the mobile computer device 20 the other way round, that is to say with the display 24 facing in the direction of the measurement object 14. A type of striped projection could then be projected onto the measurement object 14 via the display 24, this projection being advantageous particularly in a measurement of the flatness of surfaces. It goes without saying that, in the situation illustrated in FIG. 4, the third optical sensor 26′ is used instead of the first optical sensor 22′ for capturing the image data of the measurement object 14. Instead, in this situation it is possible to use the first optical sensor 22′ for identifying the reference points 52 and thus for ascertaining the position and location of the mobile computer device 20. In this case, the optical markers 48′ are preferably realized as static optical markers that are arranged fixedly on the mobile computer device 20. Said optical markers, as already mentioned further above, serve for simplified identification of the mobile computer device within the image data of the external tracking sensor 32 that are captured by the cameras 34, 36.

    [0085] Further sensors of the mobile computer device 20 may support the system 10 according to the disclosure as follows: The identity of the user 12 can be captured via a face recognition or fingerprint sensor. If appropriate, as a result it is possible to load preset user parameters from archives. The identity of the user 12 may equally well be stored together with the measurement results in corresponding databases in an automated manner. Furthermore, it is possible to capture the motor characteristics or idiosyncrasies of the user 12 in order, depending thereon, to examine the quality of the measurement results or speeds of measurements and to relate the latter to use and/or trajectory parameters and/or environmental parameters. Measurement sequences can possibly be optimized as a result. Feedback messages or instructions can equally well be passed on to the user 12 by being output acoustically with the aid of a loudspeaker or being passed on to the user 12 in tactile form with the aid of vibration actuators or being displayed to the user 12 via the display 24.

    [0086] Overall, a multiplicity of application possibilities are thus conceivable with the system 10 according to the disclosure. The system 10 according to the disclosure essentially affords the advantage, however, that a relatively exact coordinate measuring machine which is exceptionally capable of mobile use can be simulated with commercially available standard components.