COORDINATE MEASURING SYSTEM
20170292827 · 2017-10-12
CPC classification: G06T7/246, G01B11/26, G01B11/25 (Section G, Physics)
International classification: G01B11/00 (Section G, Physics)
Abstract
A system for measuring spatial coordinates of a measurement object, comprising a mobile computer device comprising a first optical sensor for capturing image data of the measurement object; a pose determination unit comprising an external tracking sensor, wherein the external tracking sensor is embodied separately from the mobile computer device and is configured to capture pose data indicative of a pose of the mobile computer device; and a control unit configured to determine the spatial coordinates of the measurement object on the basis of the image data of the measurement object and the pose data of the mobile computer device.
Claims
1. A system for measuring spatial coordinates of a measurement object, comprising: a mobile computer device comprising a first optical sensor for capturing image data of the measurement object; a pose determination unit comprising an external tracking sensor, wherein the external tracking sensor is embodied separately from the mobile computer device and configured to capture pose data indicative of a pose of the mobile computer device; and a control unit configured to determine the spatial coordinates of the measurement object based on the image data of the measurement object and the pose data of the mobile computer device.
2. The system as claimed in claim 1, wherein the control unit is integrated into the mobile computer device.
3. The system as claimed in claim 1, further comprising an external computer device, on which at least part of the control unit is implemented, wherein the external computer device is connected via a data connection to the pose determination unit and the mobile computer device.
4. The system as claimed in claim 1, wherein the control unit is configured to assume the measurement object to be time-invariant when evaluating the image data of the measurement object to determine the spatial coordinates.
5. The system as claimed in claim 1, wherein the external tracking sensor comprises a second optical sensor, and wherein the pose data of the mobile computer device comprise image data of a monitoring region including the mobile computer device.
6. The system as claimed in claim 5, wherein the second optical sensor comprises two stationary cameras.
7. The system as claimed in claim 1, wherein the pose determination unit furthermore comprises an internal position and/or location capture sensor, which is integrated into the mobile computer device and is configured to capture data with regard to position and/or location of the mobile computer device, and wherein the control unit is configured to determine the spatial coordinates of the measurement object also on the basis of the data captured by the internal position and/or location capture sensor.
8. The system as claimed in claim 1, wherein the mobile computer device furthermore comprises a third optical sensor for capturing image data of the environment of the mobile computer device, wherein the control unit is configured to identify at least one stationary reference point in the image data of the environment of the mobile computer device and to determine a position and location of said reference point relative to the mobile computer device, and wherein the control unit is configured to determine the spatial coordinates of the measurement object also on the basis of the determined position and location of the at least one identified reference point relative to the mobile computer device.
9. The system as claimed in claim 8, wherein the control unit is configured to determine whether the external tracking sensor is imaged in the image data captured by the third optical sensor.
10. The system as claimed in claim 5, wherein the mobile computer device comprises a display and an optical marker, and wherein the control unit is configured to determine the pose of the mobile computer device within the image data of the monitoring region by means of the optical marker.
11. The system as claimed in claim 10, wherein the optical marker is arranged fixedly on the mobile computer device.
12. The system as claimed in claim 10, wherein the control unit is configured to generate the optical marker on the display.
13. The system as claimed in claim 12, wherein the control unit is configured to vary a representation and/or position of the optical marker on the display over time.
14. The system as claimed in claim 13, wherein the control unit is configured to vary the representation and/or position of the optical marker on the display depending on the pose data of the mobile computer device.
15. The system as claimed in claim 13, wherein the control unit is configured to synchronize the image data of the measurement object captured by the first optical sensor with the image data of the monitoring region captured by the second optical sensor, on the basis of the temporally varied representation and/or position of the optical marker.
16. The system as claimed in claim 1, wherein the first optical sensor comprises a telecentric optical unit or a plenoptic optical unit.
17. A method for measuring spatial coordinates of a measurement object, comprising the following steps: providing a mobile computer device comprising a first optical sensor; capturing image data of the measurement object by means of the first optical sensor; capturing pose data indicative of a pose of the mobile computer device by means of a pose determination unit comprising an external tracking sensor, wherein the external tracking sensor is embodied separately from the mobile computer device; and determining the spatial coordinates of the measurement object on the basis of the image data of the measurement object and the pose data of the mobile computer device.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0057] Exemplary embodiments of the invention are shown in the drawings and are explained in greater detail in the following description.
DESCRIPTION OF PREFERRED EMBODIMENTS
[0066] The system 10 is illustrated schematically in
[0067] Instead of a large, structurally complex, relatively immobile measurement set-up of a coordinate measuring machine usually used for such tasks, the system 10 according to the disclosure for measuring the spatial coordinates of the measurement object 14 is comparatively small and capable of mobile use. The system 10 comprises a mobile computer device 20. Said mobile computer device 20 is preferably a tablet computer. By way of example, it is possible to use an iPad Air 2 WiFi plus cellular™, since this device combines a large number of the functions which are required for the system 10 according to the disclosure and are explained below. In principle, however, the use of a smartphone or laptop is also conceivable.
[0068] A first exemplary embodiment of such a mobile computer device 20 is illustrated schematically in
[0069] In the present exemplary embodiment, the mobile computer device 20 furthermore also comprises a display 24 and a further optical sensor 26, which is designated as third optical sensor 26 in the present case. The third optical sensor 26 is preferably arranged on the same side of the mobile computer device 20 as the display 24. By contrast, the first optical sensor 22 is preferably arranged on the opposite side, such that the optical sensors 22, 26 have opposite viewing directions, as is illustrated schematically with the aid of the arrows 28 (see
[0070] A further component part of the system 10 according to the disclosure is a pose determination unit 30 comprising a tracking sensor 32 for capturing data with regard to the position and location of the mobile computer device 20. Said tracking sensor 32 is embodied as an external tracking sensor, that is to say that it is not integrated into the mobile computer device 20, but rather tracks the position and location thereof externally. The external tracking sensor 32 preferably comprises one or more cameras. In the exemplary embodiment illustrated schematically in
[0072] The spatial position and location (pose) of the image recording system 20 are therefore known unambiguously at every point in time, since they are supplied by the external tracking sensor 32. The additional assumption of the time invariance of the measurement object 14 then allows correction of the imaging differences (such as e.g. different working distance and hence magnification or reduction) and the imaging aberrations (such as e.g. distortion) in the individual images of an image sequence supplied by the first optical sensor 22, and creation of a continuous and accurate 3D reconstruction of the measurement object 14 from the entire corrected image sequence. For this purpose, by way of example, at least two images of the measurement object 14 are recorded with the aid of the first optical sensor 22, wherein these two images are recorded in different positions and/or locations of the mobile computer device 20 and thus also of the first optical sensor 22. The position and location of the mobile computer device 20 (and thus also of the first optical sensor 22) at the point in time of capturing the two images can be ascertained exactly on the basis of the image data obtained by the cameras 34, 36. Size and position changes of the imaging of the measurement object 14 from one of the two images to the other can then be linked with the ascertained position and location change of the first optical sensor 22, such that ultimately the real dimensions within the two images captured by the first optical sensor 22 can be determined unambiguously. The control unit 38 is preferably configured to calculate a 3D point cloud of the measurement object 14 with the aid of the method mentioned above, wherein the coordinates of these points can be represented in an unambiguously defined, known coordinate system. Measurements with an accuracy in the range of one or a few micrometers are possible in this way.
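The two-view reconstruction described above can be sketched in simplified form: once the pose of the first optical sensor is known for two recordings, each corresponding image point defines a viewing ray, and a 3D point can be recovered as the point closest to both rays (the midpoint method). The following is an illustrative sketch only, not the implementation of the disclosed system; camera centers and ray directions are assumed to be already expressed in the tracking sensor's coordinate system.

```python
def triangulate_midpoint(c1, d1, c2, d2):
    """Closest point between two viewing rays c_i + s*d_i (midpoint method).

    c1, c2: camera centers of the two recordings (from the pose data)
    d1, d2: viewing-ray directions toward the same object point
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    w = [q - p for p, q in zip(c1, c2)]           # vector from center 1 to center 2
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    e, f = dot(w, d1), dot(w, d2)
    den = a * c - b * b                            # zero only for parallel rays
    s = (e * c - b * f) / den                      # ray parameter on ray 1
    t = (b * e - a * f) / den                      # ray parameter on ray 2
    p1 = [p + s * d for p, d in zip(c1, d1)]       # closest point on ray 1
    p2 = [p + t * d for p, d in zip(c2, d2)]       # closest point on ray 2
    return [(u + v) / 2 for u, v in zip(p1, p2)]   # midpoint = reconstructed point
```

Repeated over many corresponding image points, such a computation yields a 3D point cloud of the kind the control unit 38 is described as calculating.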
[0073] Instead of an embodiment of the control unit 38 as a component completely integrated into the mobile computer device 20, an embodiment is likewise conceivable in which at least parts of the control unit 38 and/or of the data storage are implemented in an external computer or in a cloud.
[0074] The exemplary embodiment illustrated in
[0075] The above-explained measurement principle of the system 10 according to the disclosure can be optimized with regard to its precision with the aid of a multiplicity of further system features. In order to simplify the optical capture of the position and location of the mobile computer device 20, for example, a plurality of optical markers 48 can be represented on the display 24. In accordance with one exemplary embodiment of the present disclosure, the shape and/or position of said optical markers 48 on the display 24 can be changed in a predefined manner over time. This enables, for example, an automated temporal synchronization of the external tracking sensor 32 (cameras 34, 36) with the mobile computer device 20 and the first optical sensor 22 incorporated therein. However, it is likewise possible to change the optical markers 48 represented on the display 24 depending on the position and location of the mobile computer device 20. For this purpose, the mobile computer device 20 preferably comprises an internal pose determination sensor 50 (see
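One way such marker-based synchronization could work is that the displayed marker encodes a frame counter; each externally captured frame is then matched to the device frame whose marker code it shows. The following sketch is a hypothetical illustration of that idea, not part of the disclosure; the decoding of marker codes from images is assumed to happen elsewhere.

```python
def synchronize_by_marker(device_frames, tracker_frames):
    """Match externally captured frames to device frames via the marker code.

    device_frames:  list of (device_timestamp, marker_code) pairs, one per
                    code shown on the display
    tracker_frames: list of (tracker_timestamp, decoded_marker_code) pairs
                    seen by the external cameras
    Returns a list of (device_timestamp, tracker_timestamp) pairs.
    """
    code_to_device_time = {code: t for t, code in device_frames}
    pairs = []
    for t_track, code in tracker_frames:
        if code in code_to_device_time:   # marker decoded and known to the device
            pairs.append((code_to_device_time[code], t_track))
    return pairs
```

The resulting timestamp pairs relate the two clocks, so image data of the measurement object and pose data of the monitoring region can be associated frame by frame.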
[0077] In comparison with the embodiments illustrated schematically in
[0078] A further possible application of the third optical sensor 26 in the system 10 according to the disclosure is as follows: the image data captured by the third sensor 26 can also be evaluated as to whether the cameras 34, 36 are visible in said image data. This evaluation is based on the consideration that a lack of visibility of the cameras 34, 36 in the image data captured by the third optical sensor 26 is a strong indication that the cameras 34, 36 likewise do not have an unrestricted view of the mobile computer device 20. If such a case is detected, the control unit 38 can discard the corresponding image data of one or both cameras 34, 36. This saves data capacity and increases the robustness of the position and location determination.
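A first-order version of this plausibility check can be expressed geometrically: a tracking camera can only appear in the third sensor's image if it lies within that sensor's field of view. The sketch below illustrates this direction-only test under assumed parameters (the half field-of-view angle is a placeholder); it ignores occlusion by other objects, which the actual image-based evaluation would catch.

```python
import math

def camera_in_view(sensor_pos, view_dir, camera_pos, half_fov_deg=35.0):
    """Rough visibility test: does the tracking camera lie inside the cone
    spanned by the third optical sensor's viewing direction?

    sensor_pos: position of the third optical sensor
    view_dir:   its viewing direction (need not be normalized)
    camera_pos: position of one external tracking camera
    """
    to_cam = [c - s for s, c in zip(sensor_pos, camera_pos)]
    dot = sum(a * b for a, b in zip(view_dir, to_cam))
    norm = (math.sqrt(sum(a * a for a in view_dir))
            * math.sqrt(sum(a * a for a in to_cam)))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= half_fov_deg
```

If the test fails for a given frame, the corresponding pose data from that tracking camera would be discarded, as described above.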
[0079] Further exemplary embodiments of the mobile computer device 20 and of the system 10 according to the disclosure are illustrated schematically in
[0080] In particular, telecentric clip-on optical units 54, 56 are advantageous for metrological applications or dimensional measurement. Telecentric optical units fundamentally differ from the customary optical units installed in mobile computer devices 20: objects are imaged by them with a distance-independent scale. This is ideal for the measurement, since uncertainties in the positioning of the imaging system do not translate into uncertainties of the imaging scale, which would directly limit the achievable measurement accuracy.
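The distance independence can be illustrated numerically: in a simple pinhole model of a customary (entocentric) lens, the imaging scale is roughly f/z and therefore changes with the working distance z, whereas a telecentric lens images with a fixed magnification. The numbers below are illustrative only, not parameters of the disclosed system.

```python
def pinhole_image_size(object_size_mm, focal_mm, distance_mm):
    """Image size under a simple pinhole model: scale f/z varies with distance."""
    return object_size_mm * focal_mm / distance_mm

def telecentric_image_size(object_size_mm, magnification):
    """Image size of a telecentric lens: independent of the working distance."""
    return object_size_mm * magnification

# A 10 mm object with f = 50 mm at 200 mm vs. 210 mm working distance:
near = pinhole_image_size(10.0, 50.0, 200.0)   # 2.5 mm
far = pinhole_image_size(10.0, 50.0, 210.0)    # ~2.38 mm, i.e. ~5 % scale change
```

A 10 mm positioning uncertainty thus corrupts the pinhole-model scale by several percent, while the telecentric scale is unaffected, which is exactly why the positioning uncertainty does not limit the measurement accuracy in the telecentric case.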
[0081] Furthermore, the clip-on optical units 54, 56 can also be configured such that a so-called plenoptic or light field recording becomes possible with the sensors 22′, 26′ of the mobile computer device 20. Particularly chips having an extremely high number of pixels (greater than 40 MPx), such as are becoming increasingly widespread in many mobile computer devices 20, offer a good basis for this. The "transformation" of the normal terminal cameras 22, 26 into a plenoptic camera has the advantage that the computer device 20 is directly able to generate 3D information about the measurement object 14 from individual recordings. Accordingly, the use of a plenoptic camera in combination with the position and location ascertainment effected by the external tracking sensor 32 increases the stability and/or accuracy of the measurement with the aid of the system 10 according to the disclosure, by means of which the 3D contour of the imaged measurement object 14 is reconstructed from the image data.
[0082] Furthermore, as indicated schematically in
[0083] Instead of or in addition to the clip-on optical units 54, 56 pushed onto the sensors 22′, 26′, a corresponding clip-on optical unit 60 can also be placed onto the display 24 (see
[0084] With the aid of such clip-on optical units 60 it would also be possible to use the display 24 as illumination for the measurement object 14. This is illustrated schematically in the situation depicted in
[0085] Further sensors of the mobile computer device 20 may support the system 10 according to the disclosure as follows: The identity of the user 12 can be captured via a face recognition or fingerprint sensor. If appropriate, as a result it is possible to load preset user parameters from archives. The identity of the user 12 may equally well be stored together with the measurement results in corresponding databases in an automated manner. Furthermore, it is possible to capture the motor characteristics or idiosyncrasies of the user 12 in order, depending thereon, to examine the quality of the measurement results or speeds of measurements and to relate the latter to use and/or trajectory parameters and/or environmental parameters. Measurement sequences can possibly be optimized as a result. Feedback messages or instructions can equally well be passed on to the user 12 by being output acoustically with the aid of a loudspeaker or being passed on to the user 12 in tactile form with the aid of vibration actuators or being displayed to the user 12 via the display 24.
[0086] Overall, a multiplicity of application possibilities are thus conceivable with the system 10 according to the disclosure. The system 10 according to the disclosure essentially affords the advantage, however, that a relatively exact coordinate measuring machine which is exceptionally capable of mobile use can be simulated with commercially available standard components.