ARRANGEMENT AND METHOD FOR INCREASING THE MEASUREMENT ACCURACY IN THE THREE-DIMENSIONAL MEASUREMENT OF OBJECTS
20230129186 · 2023-04-27
Inventors
- Simon Placht (Baiersdorf, DE)
- Christian Schaller (Erlangen, DE)
- Michael Balda (Baiersdorf, DE)
- Bjoern Kayser (Berlin, DE)
Cpc classification
G01B11/00
PHYSICS
H04N23/53
ELECTRICITY
International classification
Abstract
An arrangement for increasing the measurement accuracy in the three-dimensional measurement of objects, and a method for the three-dimensional measurement of objects. The arrangement has a substrate for placing thereon an object to be measured, a device for emitting light beams toward the object, a device for detecting the light beams reflected by the object and the substrate, and a device for determining 3D data on the basis of time-of-flight measurements and/or triangulation of the emitted and reflected light beams, and for determining the dimensions of the outer shell of the object. A measurement inaccuracy range caused by interferences and forming on the substrate is bridged and spaced apart from the actual measurement position by a transparent plate of predetermined thickness arranged between the at least one object and the substrate, so that the determination of the dimensions of the outer shell of the object is possible without interference and becomes more precise.
Claims
1. An arrangement for three-dimensional measurement of objects comprising: a planar substrate for placing thereon at least one object to be measured; an emitting device for emitting light beams in the direction of the at least one object; a receiving device for detecting the light beams reflected from the at least one object and the substrate; and a determining device for determining 3D data on the basis of time-of-flight measurements and/or triangulation of the emitted and reflected light beams and for determining the dimensions of the outer shell of the at least one object from the determined 3D data; a transparent plate having a predetermined thickness for placement between the at least one object and the substrate, the thickness being greater than a measurement inaccuracy range forming on the substrate, the thickness being in a range between 1 and 50 millimeters; wherein the determining device corrects the determined 3D data or the determined dimensions of the outer shell of the at least one object by the thickness of the transparent plate, in that the determining device first determines 3D data of the at least one object to be measured together with the transparent plate and the substrate, which serves as reference surface; wherein the determining device determines a virtual, planar separation plane running parallel to the reference surface and spaced apart from the latter by at least the thickness; the determining device separates and eliminates the determined 3D data between the separation plane and the reference surface from the remaining determined 3D data; and the determining device determines the dimensions of the outer shell of the at least one object on the basis of the remaining 3D data.
2. The arrangement for three-dimensional measurement of objects according to claim 1, wherein the transparent plate is a plastic plate.
3. The arrangement for three-dimensional measurement of objects according to claim 1, wherein the thickness of the transparent plate is in a range between 1 and 8 millimeters.
4. The arrangement for three-dimensional measurement of objects according to claim 1, wherein the arrangement comprises at least one adjustable lifting column for spacing the transparent plate in height direction by a distance in a range of 0.1-20 millimeters from the substrate.
5. The arrangement for three-dimensional measurement of objects according to claim 1, wherein the arrangement comprises an input device for manual or automated input of the thickness of the transparent plate and/or an input interface for receiving the distance between the transparent plate and the substrate measured by a measuring device.
6. The arrangement for three-dimensional measurement of objects according to claim 1, wherein: the emitting device, the receiving device and the determining device for the time-of-flight measurement of the emitted and reflected light beams are designed as a time-of-flight camera and/or as a LiDAR camera; or the emitting device, the receiving device and the determining device for the triangulation of the emitted and reflected light beams are designed as a structured light camera and/or as a stereo camera.
7. The arrangement for three-dimensional measurement of objects according to claim 1, wherein the substrate has an opaque surface and/or the transparent plate has, on the side facing the substrate, an opaque, light-reflecting layer which reflects light falling through the transparent plate.
8. The arrangement for three-dimensional measurement of objects according to claim 7, wherein the transparent plate has on the side facing the substrate a diffuse reflecting layer.
9. The arrangement for three-dimensional measurement of objects according to claim 8, wherein the transparent plate has on the side facing the substrate a diffuse reflecting layer which is a plastic hard foam layer.
10. The arrangement for three-dimensional measurement of objects according to claim 1, wherein the transparent plate has at least one layer of electrochromic glass.
11. The arrangement for three-dimensional measurement of objects according to claim 10, wherein the at least one layer of electrochromic glass has an adjustable transmittance.
12. The arrangement for three-dimensional measurement of objects according to claim 1, wherein the transparent plate has at least one replaceable transparent layer, as protective film, made of plastic, which protects the transparent plate from surface damage, such as scratches, so that the optical properties of the transparent plate remain unchanged.
13. The arrangement for three-dimensional measurement of objects according to claim 1, wherein the transparent plate has polarizing properties.
14. The arrangement for three-dimensional measurement of objects according to claim 1, wherein the substrate is formed by a supporting surface of a balance.
15. The arrangement for three-dimensional measurement of objects according to claim 1, wherein the transparent plate is designed as part of an electrically controlled display.
16. The arrangement for three-dimensional measurement of objects according to claim 15, wherein the electrically controlled display is an LCD or OLED display.
17. A method for three-dimensional measurement of objects, comprising the steps of: placing at least one object to be measured on a planar substrate; emitting light beams in the direction of the at least one object and detecting the light beams reflected by the at least one object and the substrate; determining 3D data based on time-of-flight measurements and/or triangulation of the emitted and reflected light beams; and determining the dimensions of the outer shell of the at least one object from the determined 3D data; arranging a transparent plate with a predetermined thickness between the at least one object and the substrate, the thickness being greater than a measurement inaccuracy range forming on the substrate, the thickness being in a range between 1 and 50 millimeters; correcting the determined 3D data or the determined dimensions of the outer shell of the at least one object by the thickness of the transparent plate; determining 3D data of the at least one object to be measured together with the transparent plate and the substrate serving as reference surface; determining a virtual, planar separation plane running parallel to the reference surface and spaced from it by the thickness; separating and eliminating the determined 3D data between the separation plane and the reference surface from the remaining determined 3D data; and determining the dimensions of the outer shell of the at least one object from the remaining 3D data.
18. The method for three-dimensional measurement of objects according to claim 17, wherein the transparent plate has a thickness of between 1 and 8 millimeters.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0055] The invention is explained in more detail below with reference to preferred embodiments with the aid of drawings.
DETAILED DESCRIPTION
[0067] Examples of embodiments of the present disclosure are described below on the basis of the accompanying figures. The same elements are indicated by the same reference signs. Features of the individual embodiments may be interchanged with each other.
[0069] As can be seen from
[0070] The measurement inaccuracy range G is caused by different interference effects and varies depending on the measuring unit 7 used. In the case of measuring units 7 that use TOF or LIDAR cameras for the measurement process, the interference effects include, for example, temporal noise and multiple reflections. In the case of TOF cameras, there is also stray light.
[0071] In the case of measuring units 7 that use SL or stereo cameras, the interference effects that occur are, for example, temporal noise, errors in the localization of projected pattern areas, and changes in the stereo base.
[0072] As can be seen from
[0073] In order to circumvent the measurement inaccuracy range G and enable precise measurements of the object 3, as shown in
[0074] The transparent plate 8 is not perceptible to the determining device 6: the emitted light rays L continue to pass through it almost undisturbed as far as the substrate 2 or the opaque functional layer 9 and are then reflected. The measurement inaccuracy range G, which is virtually perceptible to the measuring unit 7, therefore still forms on the substrate 2 or on the opaque functional layer 9, while the object 3 to be measured is, as it were, lifted out of the measurement inaccuracy range G and spaced apart from the substrate 2.
[0075] In other words, the measurement inaccuracy range G is formed within the transparent plate 8, which is not perceptible to the measuring unit 7, while the object 3 to be measured rests on the transparent plate 8 and is thus spaced from the measurement inaccuracy range G.
[0076] During the three-dimensional measurement of the object 3, the determining device 6 first determines 3D data of the object 3 together with the measurement inaccuracy range G and the functional layer 9, which serves as the reference surface R. At the same time, the determining device 6 also determines a virtual, planar separation plane S that runs parallel to the reference surface R and is spaced apart from it by at least the thickness D. Subsequently, the determining device 6 separates the determined 3D data lying between the separation plane S and the reference surface R from the remaining determined 3D data and eliminates the 3D data from this intermediate area. Thereupon, the determining device 6 precisely determines the dimensions of the outer shell of the object 3 based on the remaining 3D data.
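The elimination step of paragraph [0076] can be sketched as a simple point-cloud filter (a minimal, hypothetical illustration: it assumes the 3D data is an N×3 array whose z axis is normal to the reference surface R at z = 0; the function name and toy data are not from the source):

```python
import numpy as np

def remove_inaccuracy_band(points, thickness_d):
    """Drop all 3D points between the reference surface R (z = 0) and the
    virtual separation plane S parallel to it at height z = thickness_d.

    points      -- (N, 3) array of x, y, z coordinates, z normal to R
    thickness_d -- plate thickness D (same unit as the coordinates)
    """
    z = points[:, 2]
    # keep only points above the separation plane S; everything in the
    # band [0, D] is attributed to the plate / inaccuracy range G
    return points[z > thickness_d]

# toy cloud: reference surface, noise inside the band, object points on top
cloud = np.array([
    [0.0, 0.0, 0.0],    # reference surface R
    [1.0, 1.0, 2.5],    # noise inside the inaccuracy band (D = 5 mm)
    [2.0, 2.0, 12.0],   # object point
    [2.0, 3.0, 30.0],   # object point
])
remaining = remove_inaccuracy_band(cloud, thickness_d=5.0)
print(len(remaining))  # 2 object points remain
```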
[0077] Here, the thickness D of the transparent plate 8 can be precisely input via an input device 10. The input device 10 can be designed as a PC. A data carrier on which a configuration file with corresponding dimensions of the transparent plate 8 is stored can also serve as input device 10.
[0078] The thickness D of the plate 8 is selected here in coordination with the measurement inaccuracy range G, which varies depending on the environmental conditions and the measuring unit 7 used. In principle, the thickness D of the plate 8 is selected to be at least a factor of 1.2 greater than the inaccuracy range G occurring under known environmental parameters, so that the object 3 to be measured is spaced completely outside the measurement inaccuracy range G even in the event of unforeseen fluctuations in which the inaccuracy range G turns out larger than planned.
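The selection rule of paragraph [0078] amounts to a one-line safety margin (hypothetical helper; the example value of G is illustrative only):

```python
def minimum_plate_thickness(inaccuracy_range_g, safety_factor=1.2):
    """Plate thickness D chosen at least a factor of 1.2 greater than the
    measurement inaccuracy range G, per paragraph [0078]."""
    return safety_factor * inaccuracy_range_g

# e.g. an inaccuracy range of 4 mm calls for a plate of at least 4.8 mm
print(minimum_plate_thickness(4.0))
```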
[0080] The lifting columns 11 can be driven in this case, for example, by a servomotor which has a measuring device 12 for recording the travel distance, for example a resolver, incremental encoder or absolute encoder. The travel distance, which corresponds to the distance A, is passed on to the measuring unit 7 via an input interface 13.
[0081] The dimensions of the object to be measured are used here to calculate its volume, where the variable x stands for the length, the variable y for the width and the variable z for the height of the object, the latter being the measured height H minus the plate thickness D and the distance A traversed by the lifting column 11.
[0082] In other words, the height H measured by the measuring unit 7 must be corrected by the thickness D of the transparent plate 8 and the distance A set by the lifting columns 11. Accordingly, the volume calculation formula for the determining device 6 or measuring unit 7 is:
V=x·y·z=x·y·(H−(D+A)).
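The corrected volume calculation can be sketched as follows (hypothetical function name; units are assumed consistent, e.g. millimeters throughout):

```python
def corrected_volume(x, y, measured_height_h, plate_d, lift_a):
    """V = x * y * (H - (D + A)): the measured height H is corrected by
    the plate thickness D and the lifting-column distance A ([0082])."""
    return x * y * (measured_height_h - (plate_d + lift_a))

# 100 mm x 50 mm footprint, H = 37 mm, D = 5 mm, A = 2 mm -> 150000 mm^3
print(corrected_volume(100.0, 50.0, 37.0, 5.0, 2.0))  # 150000.0
```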
[0084] In addition to the determining device 6 for acquiring 3D data, the arrangement 1 also has a color camera 15. The color camera 15 is calibrated to the determining device 6, i.e. both a projective image of the color camera 15 and a rigid body transformation between the 3D determining device 6 and the color camera 15 are determined.
[0085] The rigid body transformation is used here to describe the position and orientation of the 3D data set acquired by the determining device 6, i.e. its coordinate system, with respect to a reference coordinate system, namely that of the color camera 15.
[0086] With the aid of the rigid body transformation, the outer shell 16 of the recorded object 3 can be transformed to the image plane of the color camera 15.
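This transformation into the image plane can be sketched with a standard pinhole model (an assumed simplification: the intrinsic matrix K, the extrinsics R and t, and all example values are illustrative and not taken from the source):

```python
import numpy as np

def project_shell_to_color_image(shell_points, r, t, k):
    """Transform outer-shell points from the 3D sensor frame into the
    color camera frame via the rigid body transformation (R, t), then
    project them onto the image plane with pinhole intrinsics K."""
    cam = shell_points @ r.T + t      # rigid body transformation
    uvw = cam @ k.T                   # pinhole projection (homogeneous)
    return uvw[:, :2] / uvw[:, 2:3]   # divide by depth -> pixel coords

# identity extrinsics; fx = fy = 500, principal point (320, 240)
k = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
pts = np.array([[0.1, 0.0, 1.0]])  # one shell point 1 m in front of the camera
print(project_shell_to_color_image(pts, np.eye(3), np.zeros(3), k))  # [[370. 240.]]
```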
[0088] In this case, the identification of display areas in the color image is done by a temporal analysis of each individual pixel, e.g. by playing a fast black-and-white sequence on the display 14. In areas 16a of the transformed outer shell 16 that protrude onto the display 14, these color changes will be clearly visible, whereas in areas belonging to the object 3 no color changes will be visible, because the object obscures the display 14.
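The per-pixel temporal analysis can be sketched as a variance test over the recorded frame stack (a minimal illustration; the threshold value and toy frames are assumptions, not from the source):

```python
import numpy as np

def display_mask(frames, var_threshold=0.01):
    """Identify unoccluded display pixels by temporal analysis: while the
    display 14 plays a fast black/white sequence, visible display pixels
    change strongly over time; pixels covered by the object 3 do not.

    frames -- (T, H, W) stack of grayscale color-camera frames in [0, 1]
    """
    return frames.var(axis=0) > var_threshold

# 4 frames of a 2x2 image: left column shows the blinking display,
# right column is occluded by the object (constant gray)
frames = np.array([
    [[1.0, 0.5], [1.0, 0.5]],
    [[0.0, 0.5], [0.0, 0.5]],
    [[1.0, 0.5], [1.0, 0.5]],
    [[0.0, 0.5], [0.0, 0.5]],
])
print(display_mask(frames))  # left column True, right column False
```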
[0089] Both the electric display 14 and the additional color camera 15 are controlled and configured by the input device 10.
[0090] In
[0091] The switchability of the electrochromic glass 17 between a transparent and an opaque state allows the thickness D of the transparent plate 8 to be determined automatically: in the transparent state of the electrochromic glass 17, the emitted light rays L are reflected at the substrate 2, whereas in the opaque state they are reflected at the upper side of the electrochromic glass 17. The determining device 6 determines the thickness D of the transparent plate 8 from the difference between the two distance measurements (in the transparent and in the opaque state). The electrochromic glass 17 is controlled by the input device 10.
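The thickness determination from the two switching states reduces to a difference of distances (hypothetical function name and example values, assuming both distances are measured along the same viewing direction):

```python
def plate_thickness_from_switching(dist_transparent, dist_opaque):
    """Thickness D of the electrochromic plate per paragraph [0091]:
    transparent state  -> light reflects at the substrate 2,
    opaque state       -> light reflects at the plate's upper side;
    the difference between the two measured distances is D."""
    return dist_transparent - dist_opaque

# e.g. 500 mm to the substrate vs. 495 mm to the plate surface -> D = 5 mm
print(plate_thickness_from_switching(500.0, 495.0))  # 5.0
```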
[0092] In
[0093] In
[0096] First, in step S1, the transparent plate 8 with a predetermined thickness D is placed between the at least one object 3 and the substrate 2.
[0097] In step S2, the at least one object 3 to be measured is placed on the planar substrate 2.
[0098] In step S3, light rays L are emitted in the direction of the at least one object 3, and the light rays reflected from the at least one object 3 and the substrate 2 are detected.
[0099] In step S4, 3D data is obtained based on time-of-flight measurements and/or triangulation of the emitted and reflected light beams.
[0100] In step S5, the 3D data of the at least one object 3 to be measured are determined together with the transparent plate 8 and the substrate 2 serving as the reference surface R.
[0101] In step S6, a virtual, planar separation plane S is determined, which runs parallel to the reference surface R and is spaced from it by the thickness D.
[0102] In step S7, the determined 3D data between separation plane S and reference surface R are separated from the remaining determined 3D data and eliminated.
[0103] In step S8, the determined 3D data or the determined dimensions of the outer shell of the at least one object 3 are corrected by at least the thickness D of the transparent plate 8.
[0104] In step S9, the dimensions of the outer shell of the at least one object 3 are determined using the remaining 3D data.
[0105] After step S9, the measurement is completed.
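Steps S5 through S9 can be sketched together as follows (a minimal illustration; it assumes the 3D data is a point cloud with z normal to the reference surface R at z = 0, and that the outer-shell dimensions are taken as an axis-aligned bounding box, a simplification not stated in the source):

```python
import numpy as np

def measure_object(points, thickness_d):
    """Sketch of steps S5-S9: eliminate the 3D data between the reference
    surface R (z = 0) and the separation plane S at z = D (S6/S7), correct
    the height by D (S8), and take the outer-shell dimensions from the
    remaining points (S9)."""
    remaining = points[points[:, 2] > thickness_d]          # S6 + S7
    length = remaining[:, 0].max() - remaining[:, 0].min()  # x extent
    width = remaining[:, 1].max() - remaining[:, 1].min()   # y extent
    height = remaining[:, 2].max() - thickness_d            # S8: subtract D
    return length, width, height

# toy cloud (mm): one point inside the inaccuracy band, two object points
cloud = np.array([
    [0.0, 0.0, 0.5],     # inside the band below S (eliminated)
    [0.0, 0.0, 5.1],     # object foot resting on the plate (D = 5 mm)
    [40.0, 20.0, 35.0],  # object top corner
])
print(measure_object(cloud, thickness_d=5.0))
```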