Three-dimensional measurement device with annotation features
10719947 · 2020-07-21
Assignee
Inventors
- Reinhard Becker (Stuttgart, DE)
- Martin Ossig (Tamm, DE)
- Joseph A. Arezone (Cleveland Heights, OH, US)
- Gerrit Hillebrand (Waiblingen, DE)
- Rene Pfeiffer (Muehlacker, DE)
- Daniel Döring (Ditzingen, DE)
CPC classification
B25J9/161
PERFORMING OPERATIONS; TRANSPORTING
G06K7/10297
PHYSICS
G01B11/14
PHYSICS
H04N13/183
ELECTRICITY
H04N13/254
ELECTRICITY
G01S7/481
PHYSICS
G01S17/66
PHYSICS
G06T7/521
PHYSICS
H04N13/243
ELECTRICITY
International classification
G01S3/803
PHYSICS
H04N13/243
ELECTRICITY
G01S17/66
PHYSICS
G01B11/14
PHYSICS
B25J15/04
PERFORMING OPERATIONS; TRANSPORTING
G01B11/00
PHYSICS
G06K7/10
PHYSICS
H04N13/254
ELECTRICITY
H04N13/183
ELECTRICITY
G06T7/521
PHYSICS
Abstract
A three-dimensional (3D) measurement system and method is provided. The system includes a noncontact measurement device, an annotation member, and a processor. The noncontact measurement device is operable to measure a distance from the noncontact measurement device to a surface. The annotation member is coupled to the noncontact measurement device. The processor is operably coupled to the noncontact measurement device and the annotation member, the processor being operable to execute computer instructions for determining 3D coordinates of at least one point in a field of view based at least in part on the distance, recording an annotation in response to an input from a user, and associating the annotation with the at least one point.
Claims
1. A three-dimensional (3D) measurement system comprising: a noncontact measurement device operable to measure a distance from the noncontact measurement device to a surface; an annotation member coupled to the noncontact measurement device; a laser light source arranged to emit a visible light beam from the noncontact measurement device; and a processor operably coupled to the noncontact measurement device and the annotation member, the processor operable to execute computer instructions for determining 3D coordinates of a first point in a field of view based at least in part on the distance, recording an annotation received from a user, and associating the annotation with the first point; wherein the processor is further operable to execute the computer instructions to perform a method comprising: highlighting, by the 3D measurement system, the first point on the surface by emitting, by the laser light source, from a first position of the 3D measurement system, the visible light beam onto said first point; locking, by the laser light source, the first point on the surface, in response to an input from the user; and in response to the 3D measurement system moving to a second position, maintaining the highlight of the first point on the surface by emitting the visible light beam onto the first point from the second position of the 3D measurement system.
2. The system of claim 1, wherein the annotation member is a microphone arranged to receive sounds from the user.
3. The system of claim 2, wherein the microphone includes a plurality of microphones, each of the plurality of microphones being coupled to the noncontact measurement device and arranged to receive sound from a different direction.
4. The system of claim 3, wherein the processor is further operable to determine a direction that the sound is generated based on signals from the plurality of microphones.
5. The system of claim 2, wherein the processor is further operable to execute a process on the noncontact measurement device in response to a sound received by the microphone from the user.
6. The system of claim 1, wherein the processor is further operable to perform a method comprising: in a first instance emitting the visible light beam onto the first point in response to a first input from the user; in a second instance emitting the visible light beam onto a second point in response to a second input from the user; and comparing measurement data around the first point and the second point based at least in part on the 3D coordinates of the first point and the second point.
7. The system of claim 1, wherein the noncontact measurement device includes a camera with the field of view, and wherein the processor is further operable to perform a method comprising: emitting the visible light beam onto a surface in response to an input from the user; acquiring an image of the surface with a spot of light from the visible light beam; and determining 3D coordinates of the spot of light on the surface based at least in part on a baseline distance between the camera and the laser light source.
8. The system of claim 1, wherein the annotation member further includes a microphone, and wherein the processor is further operable to perform a method comprising: emitting the visible light beam onto a surface in response to the input from the user; acquiring an image of the surface with a spot of light from the visible light beam; recording a sound from the user; and associating the recording of the sound with the image.
9. The system of claim 1, wherein the noncontact measurement device includes a camera with the field of view, and wherein the processor is further operable to perform a method comprising: emitting the visible light beam onto a surface in response to the input from the user; acquiring an image of the surface with a spot of light from the visible light beam; receiving a computer-aided design (CAD) model of an object being scanned; and defining a reference point on the CAD model based at least in part on the image.
10. The system of claim 1, wherein the noncontact measurement device includes a camera with the field of view, and wherein the processor is further operable to perform a method comprising: defining a reference point in a computer-aided-design model; and emitting the visible light beam onto a feature of an object in response to the input from the user, the feature being based at least in part on the reference point.
11. A method comprising: acquiring point data about a plurality of points on a surface with a noncontact measurement device; determining 3D coordinates of the plurality of points based at least in part on the point data; in a first instance, recording annotation data with an annotation member in response to an input from a user, the annotation member being coupled to the noncontact measurement device; associating, interactively, the annotation data with at least a portion of the 3D coordinates of points, wherein the portion of the 3D coordinates of points were determined from the point data acquired contemporaneously with the annotation data, wherein the associating comprises: highlighting the portion of the 3D coordinates by emitting, by a laser light source, from a first position of the noncontact measurement device, a visible light beam onto a first point from the portion of the 3D coordinates; locking the first point, in response to the input from the user; and in response to the noncontact measurement device moving to a second position, maintaining the highlight of the first point by emitting the visible light beam onto said first point from the second position of the noncontact measurement device.
12. The method of claim 11, wherein the annotation member includes a microphone and the recording of annotation data includes recording sound data.
13. The method of claim 12, wherein the sound data includes an operator's voice.
14. The method of claim 12, wherein the annotation member includes a plurality of microphones, each of the microphones arranged to receive sound from a different direction.
15. The method of claim 14, further comprising determining a direction of the sound based at least in part on signals from each of the plurality of microphones.
16. The method of claim 12, further comprising executing a process on the noncontact measurement device based at least in part on the sound data.
17. The method of claim 11, wherein the annotation member includes the laser light source that emits the visible light beam, and wherein the recording of annotation data further comprises: emitting the visible light beam onto a surface in an environment; and acquiring an image of a spot of light on the surface generated by the visible light beam.
18. The method of claim 17, wherein the recording the annotation data further comprises: in a first instance emitting the visible light beam onto the first point in response to a first input from the user; in a second instance emitting the visible light beam onto a second point in response to a second input from the user; and comparing measurement data around the first point and the second point based at least in part on the 3D coordinates of the first point and the second point.
19. The method of claim 17, wherein the recording annotation data further comprises determining 3D coordinates of the spot of light on the surface based at least in part on a baseline distance between a camera and the laser light source.
20. The method of claim 17, wherein the recording annotation data further comprises: recording a sound from the user; and associating the recording of the sound with the image.
21. The method of claim 17, wherein the recording annotation data further comprises: defining a reference point in a computer-aided-design model; receiving by the noncontact measurement device the computer-aided design (CAD) model of an object being scanned; and emitting the visible light beam onto a feature of the object in response to the input from the user, the feature being based at least in part on the reference point.
22. The method of claim 17, wherein: the emitting of the visible light beam includes emitting the visible light beam onto a feature of the surface; and acquiring a feature image of the feature with the spot of light from the visible light beam.
Description
BRIEF DESCRIPTION OF DRAWINGS
(1) The subject matter, which is regarded as the disclosure, is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features, and advantages of the disclosure are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:
(9) The detailed description explains embodiments of the disclosure, together with advantages and features, by way of example with reference to the drawings.
DETAILED DESCRIPTION
(10) Embodiments of the invention provide for a three-dimensional (3D) measurement device that acquires annotation data that is registered with the coordinate data. Embodiments of the invention provide for the recording of sounds, such as the operator's voice or background noise. Still further embodiments provide for the emitting of a visible light beam that can be used for marking locations in the environment that is being scanned, or for measuring distances between two marked points.
(11) Referring now to
(12) As discussed in more detail herein, in an embodiment the projector 24 projects a pattern of light onto surfaces in the environment. As used herein, the term projector is defined to generally refer to a device for producing a pattern. The generation of the pattern can take place by means of deflecting methods, such as generation by means of diffractive optical elements or micro-lenses (or single lasers), or by shading methods, for example production by means of shutters, transparencies (as would be used in a transparency projector), and other masks. The deflecting methods have the advantage that less light is lost, and consequently a higher intensity is available.
(13) The cameras 26, 28 acquire images of the pattern and in some instances are able to determine the 3D coordinates of points on the surface using trigonometric principles, e.g., epipolar geometry.
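As a non-limiting sketch of the trigonometric principle referred to above, the depth of a surface point can be recovered from the known baseline between two devices and the angles at which each device observes the point. The function below is illustrative only; its names and parameters are assumptions, not part of the described system:

```python
import math

def triangulate_depth(baseline_m, angle_a_rad, angle_b_rad):
    """Depth of a surface point from a known baseline via the law of sines.

    baseline_m: known separation between the two devices (e.g. projector
    and camera). angle_a_rad / angle_b_rad: angles between the baseline
    and the rays from each device to the surface point.
    """
    # Third angle of the triangle formed by the two devices and the point.
    apex = math.pi - angle_a_rad - angle_b_rad
    # Law of sines gives the range from device B to the point.
    range_b = baseline_m * math.sin(angle_a_rad) / math.sin(apex)
    # Perpendicular distance (depth) from the baseline to the point.
    return range_b * math.sin(angle_b_rad)
```

For a symmetric configuration (both rays at 45 degrees over a 1 m baseline), the point lies 0.5 m from the baseline.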
(14) It should be appreciated that while the illustrated embodiments show and describe the device that determines 3D coordinates as being an image scanner, this is for exemplary purposes and the claimed invention should not be so limited. In other embodiments, devices that use other noncontact means for measuring 3D coordinates may also be used, such as a laser scanner device that uses time-of-flight to determine the distance to the surface.
(15) A controller 48 is coupled for communication to the projector 24, the cameras 26, 28, 40 and, in an embodiment, the annotation device 47. The connection may be a wired connection (data transmission media 50) or a wireless connection. The controller 48 is a suitable electronic device capable of accepting data and instructions, executing the instructions to process the data, and presenting the results. Controller 48 may accept instructions through user interface 52, or through other means such as but not limited to electronic data card, voice activation means, manually-operable selection and control means, radiated wavelength, and electronic or electrical transfer.
(16) Controller 48 uses signals that act as input to various processes for controlling the system 20. The digital signals represent one or more types of system 20 data, including but not limited to images acquired by cameras 26, 28, 40, temperature, ambient light levels, operator inputs via user interface 52, and the like.
(17) Controller 48 is operably coupled with one or more components of system 20 by data transmission media 50. Data transmission media 50 includes, but is not limited to, twisted pair wiring, coaxial cable, and fiber optic cable. Data transmission media 50 also includes, but is not limited to, wireless, radio and infrared signal transmission systems. Controller 48 is configured to provide operating signals to these components and to receive data from these components via data transmission media 50.
(18) In general, controller 48 accepts data from cameras 26, 28, 40, projector 24 and light source 42, and is given certain instructions for the purpose of determining the 3D coordinates of points on surfaces being scanned. The controller 48 may compare the operational parameters to predetermined variances and, if a predetermined variance is exceeded, generate a signal that may be used to indicate an alarm to an operator or to a remote computer via a network. Additionally, the signal may initiate other control methods that adapt the operation of the system 20, such as changing the operational state of cameras 26, 28, 40, projector 24 or light source 42 to compensate for the out-of-variance operating parameter.
(19) The data received from cameras 26, 28, 40 may be displayed on a user interface 52. The user interface 52 may be an LED (light-emitting diode) display, an LCD (liquid crystal display), a CRT (cathode ray tube) display, a touch-screen display or the like. A keypad may also be coupled to the user interface for providing data input to controller 48. In an embodiment, the controller 48 displays a point cloud in the user interface 52 to visually represent the acquired 3D coordinates.
(20) In addition to being coupled to one or more components within system 20, controller 48 may also be coupled to external computer networks such as a local area network (LAN) and the Internet. A LAN interconnects one or more remote computers, which are configured to communicate with controller 48 using a well-known computer communications protocol such as TCP/IP (Transmission Control Protocol/Internet Protocol), RS-232, ModBus, and the like. Additional systems 20 may also be connected to LAN with the controllers 48 in each of these systems 20 being configured to send and receive data to and from remote computers and other systems 20. The LAN is connected to the Internet. This connection allows controller 48 to communicate with one or more remote computers connected to the Internet.
(21) Controller 48 includes a processor 54 coupled to a random access memory (RAM) device 56, a non-volatile memory (NVM) device 58, a read-only memory (ROM) device 60, one or more input/output (I/O) controllers, and a LAN interface device 62 via a data communications bus.
(22) LAN interface device 62 provides for communication between controller 48 and a network in a data communications protocol supported by the network. ROM device 60 stores an application code, e.g., main functionality firmware, including initializing parameters, and boot code, for processor 54. Application code also includes program instructions as shown in
(23) NVM device 58 is any form of non-volatile memory such as an EPROM (Erasable Programmable Read Only Memory) chip, a disk drive, or the like. Stored in NVM device 58 are various operational parameters for the application code. The various operational parameters can be input to NVM device 58 either locally, using the user interface 52 or a remote computer, or remotely via the Internet using a remote computer. It will be recognized that application code can be stored in NVM device 58 rather than ROM device 60.
(24) Controller 48 includes operation control methods embodied in application code such as that shown in
(25) As will be discussed in more detail herein, the controller 48 may be configured to receive audio annotation data and associate/integrate the annotation data with the three-dimensional coordinate data. In an embodiment, the audio annotation data is a spoken voice. In one embodiment, the controller 48 includes computer instructions written to be executed by processor 54 to translate the audio annotation data from a first spoken language to a second spoken language.
(26) In an embodiment, the controller 48 further includes an energy source, such as battery 64. The battery 64 may be an electrochemical device that provides electrical power for the controller 48. In an embodiment, the battery 64 may also provide electrical power to the cameras 26, 28, 40, the projector 24 and the annotation device 47. In some embodiments, the battery 64 may be separate from the controller (e.g. a battery pack). In an embodiment, a second battery (not shown) may be disposed in the housing 36 to provide electrical power to the cameras 26, 28, 40 and projector 24. In still further embodiments, the light source 42 may have a separate energy source (e.g. a battery pack).
(27) In one embodiment, the controller 48 may include a microphone 65. As discussed further herein, the microphone 65 may be used to annotate the point cloud data acquired during the scanning process. In an embodiment, the microphone 65 may be used by itself, or in combination with an annotation device 47.
(28) It should be appreciated that while the controller 48 is illustrated as being separate from the housing 36, this is for exemplary purposes and the claims should not be so limited. In other embodiments, the controller 48 is integrated into the housing 36. Further, while embodiments herein illustrate the controller 48 as being coupled with a single image scanner 22, this is for exemplary purposes and the claims should not be so limited. In other embodiments, the controller 48 may be coupled to and combine three-dimensional coordinate data and annotation data from multiple image scanners 22.
(29) In the illustrated embodiment, the projector 24 and cameras 26, 28 are arranged spaced apart in a triangular arrangement where the relative distances and positions between the components are known. The triangular arrangement is advantageous in providing information beyond that available from two cameras and a projector arranged in a straight line, or from a system with a projector and a single camera. The additional information may be understood in reference to
(30) In
(31)
(32) Consider the embodiment of
(33) To check the consistency of the image point P.sub.1, intersect the plane P.sub.3-E.sub.31-E.sub.13 with the reference plane 108 to obtain the epipolar line 114. Intersect the plane P.sub.2-E.sub.21-E.sub.12 to obtain the epipolar line 116. If the image point P.sub.1 has been determined consistently, the observed image point P.sub.1 will lie on the intersection of the determined epipolar line 114 and line 116.
(34) To check the consistency of the image point P.sub.2, intersect the plane P.sub.3-E.sub.32-E.sub.23 with the reference plane 110 to obtain the epipolar line 105. Intersect the plane P.sub.1-E.sub.12-E.sub.21 to obtain the epipolar line 107. If the image point P.sub.2 has been determined consistently, the observed image point P.sub.2 will lie on the intersection of the determined epipolar lines 107 and 105.
(35) To check the consistency of the projection point P.sub.3, intersect the plane P.sub.2-E.sub.23-E.sub.32 with the reference plane 110 to obtain the epipolar line 118. Intersect the plane P.sub.1-E.sub.13-E.sub.31 to obtain the epipolar line 120. If the projection point P.sub.3 has been determined consistently, the projection point P.sub.3 will lie on the intersection of the determined epipolar line 118 and line 120.
(36) The redundancy of information provided by using a 3D imager 100 having a triangular arrangement of projector and cameras may be used to reduce measurement time, to identify errors, and to automatically update compensation/calibration parameters. It should be appreciated that, based on the epipolar geometry relationships described herein, the distance from the image scanner 22 to points on the surface being scanned may be determined. By moving the image scanner 22, determining the pose/orientation of the image scanner, and performing a registration process, the three-dimensional coordinates of locations (point data) on a surface may be determined.
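The consistency checks described above all reduce to the same geometric test: an observed point must lie at the intersection of two epipolar lines. A minimal sketch of that test in homogeneous coordinates follows; the function name and tolerance are illustrative assumptions, not the system's implementation:

```python
import numpy as np

def epipolar_consistent(point_xy, line_a, line_b, tol=1e-6):
    """Check that an observed image point lies at the intersection of two
    epipolar lines, each given as homogeneous coefficients (a, b, c) of
    the line a*x + b*y + c = 0.
    """
    # The intersection of two lines is their cross product in
    # homogeneous coordinates.
    inter = np.cross(np.asarray(line_a, float), np.asarray(line_b, float))
    if abs(inter[2]) < tol:
        return False  # lines are parallel: no finite intersection
    inter_xy = inter[:2] / inter[2]
    return bool(np.linalg.norm(inter_xy - np.asarray(point_xy, float)) < tol)
```

For example, the lines x = 1 and y = 2 intersect at (1, 2), so an image point observed there passes the check while a point at the origin fails it.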
(37) Referring now to
(38) In the embodiment of
(39) It should be appreciated that the ability to record sound, such as verbal notes spoken by the user, provides advantages in that it allows the user to integrate the annotations with the 3D coordinate data, whereas previously these notes would have been kept separately. Further, the separate, and usually handwritten, notes would not have the positional association within the 3D coordinate data that embodiments described herein provide.
(40) In still further embodiments, the microphones 130 may be used to control the image scanner 22 with audio commands. In this embodiment, in response to an operator input, the microphone 130 receives spoken/audible words from the operator. The controller 48 matches the received words with a list of predetermined commands and initiates control sequences or procedures in response.
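The matching step above can be sketched with fuzzy string matching against a command list. The vocabulary, tokens, and cutoff below are hypothetical; the description only states that received words are matched against predetermined commands:

```python
import difflib

# Hypothetical command vocabulary mapping spoken phrases to control tokens.
COMMANDS = {"start scan": "START", "stop scan": "STOP", "mark point": "MARK"}

def match_command(spoken_text, cutoff=0.6):
    """Return the control token for the closest predetermined command,
    or None when nothing matches well enough."""
    hits = difflib.get_close_matches(
        spoken_text.lower(), COMMANDS, n=1, cutoff=cutoff
    )
    return COMMANDS[hits[0]] if hits else None
```

A slightly misrecognized phrase such as "stop scann" still resolves to the STOP token, while unrelated speech is ignored.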
(41) Referring now to
(42) It should be appreciated that when the visible beam of light 138 intersects or strikes a surface, such as surface 140 shown in
(43) In one embodiment, the operator may use the spot of light 142 to indicate cardinal points that will be used by the controller for registration of the images acquired by the image scanner 22 during the scanning process. In still another embodiment, the controller 48 receives a computer-aided-design (CAD) model of an object being scanned and the spot of light 142 marked by the operator defines a reference point for the CAD model.
(44) In an embodiment, the laser light source 134 includes a steering mechanism 144. The steering mechanism 144 may be a galvo mirror or a digital micromirror device, for example. The steering mechanism 144 allows the direction of the light beam 138 to be changed relative to the front side 135. In an embodiment, the direction of the light beam 138 may be dynamically changed as the image scanner 22 is moved or its pose changed. In another embodiment, the direction of the light beam 138 may be changed by the operator, such as through the controller 48 for example.
(45) In one embodiment, the operator may direct the light beam 138 to create a spot of light 142 in a desired position. The operator, such as via controller 48 for example, may lock the light beam 138 on that spot. Then, as the position or pose of the image scanner 22 changes, the steering mechanism 144 automatically changes the direction of the light beam 138 to maintain the spot of light 142 in the locked-on location. It should be appreciated that the spot of light 142 will then appear in multiple images acquired by color camera 40.
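The lock-on behavior amounts to recomputing, for each new scanner pose, the beam direction that still points at the fixed world point. A minimal sketch follows; the frame conventions and function name are illustrative assumptions:

```python
import numpy as np

def steering_direction(locked_point_w, scanner_pos_w, scanner_rot_w):
    """Unit direction, in the scanner's own frame, that the steering
    mechanism must aim the beam to keep hitting a locked world point.

    locked_point_w / scanner_pos_w: 3-vectors in world coordinates.
    scanner_rot_w: 3x3 rotation taking scanner-frame vectors to the
    world frame (so its transpose maps world to scanner frame).
    """
    to_point_w = np.asarray(locked_point_w, float) - np.asarray(scanner_pos_w, float)
    # Express the world-frame ray in the scanner frame (inverse rotation).
    to_point_s = np.asarray(scanner_rot_w, float).T @ to_point_w
    return to_point_s / np.linalg.norm(to_point_s)
```

Re-evaluating this direction whenever the tracked pose updates keeps the spot stationary on the surface even as the scanner moves.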
(46) Referring now to
(47) In one embodiment, the operator may move or change the pose of the image scanner 22 between marking the first location with the spot of light 146 and the second location with a spot of light 142. In another embodiment, the laser light source 134 includes the steering mechanism 144 and the operator may use the steering mechanism to change the direction of the light beam 138. It should be appreciated that the locations of spots of light 146, 142 do not need to be within the same field of view of the cameras 26, 28, 40 for the distance between the locations of the spots of light 146, 142 to be measured.
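Once both marked spots have 3D coordinates in a common registered frame, the measurement between them is a straight Euclidean distance; this is a sketch, not the system's implementation:

```python
import numpy as np

def spot_distance(spot_a_xyz, spot_b_xyz):
    """Euclidean distance between two laser-marked spots whose 3D
    coordinates were determined in a common (registered) frame."""
    a = np.asarray(spot_a_xyz, float)
    b = np.asarray(spot_b_xyz, float)
    return float(np.linalg.norm(a - b))
```

Because the coordinates live in the registered frame, this works even when the two spots were never in the cameras' field of view at the same time.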
(48) In still another embodiment, the image scanner 22 may determine the three-dimensional coordinates of the locations of the spots of light 146, 142 directly using the laser light source 134 and the color camera 40. It should be appreciated that the laser light source and color camera 40 may be device 1 and device 2 of
(49) Referring now to
(50) The first process 206 includes acquiring an image (or multiple images) of the light pattern projected by projector 24 with cameras 26, 28 in block 210. The system 20 then proceeds to block 212 where the three-dimensional coordinates of the scanned area are determined. In some embodiments, the acquisition of three-dimensional coordinates of an object may involve scanning the object from multiple locations where the image scanner 22 is moved relative to the object (or the object is moved relative to the image scanner). In these embodiments, the first process proceeds to block 213 where the three-dimensional coordinate data from the multiple scanning positions is registered. It should be appreciated that the registration of the three-dimensional coordinate data may be performed during the scanning process (simultaneously with or contemporaneously with the determination of the three-dimensional coordinates) or may be performed after the object is scanned.
(51) The second process 208 initiates at query block 214 where it is determined whether the operator desires to record any annotations. In an embodiment, the image scanner 22 may have an actuator or switch (not shown) on the handle 132 that the operator actuates to initiate the annotations. When the query block 214 returns a positive result, the second process 208 proceeds to block 216 where at least one of a position, orientation/pose or time is recorded. It should be appreciated that this may be used to register the annotation data to the three-dimensional coordinate data.
(52) The second process 208 then proceeds to block 218 where the annotation data 220 is recorded. The annotation data may include, but is not limited to, audio notes 222, background audio 224, and measurements between user defined locations 226 for example. The second process 208 then proceeds to node 228 and merges with the first process 206. The method 200 then proceeds to block 230 where the annotation data 220 is registered with the three-dimensional coordinate data.
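Block 230's merge of annotation data with coordinate data can be sketched as nearest-timestamp association, since block 216 records a time with each annotation. The data layout below is a hypothetical simplification:

```python
import bisect

def register_annotations(frames, annotations):
    """Associate each annotation with the scan frame whose timestamp is
    closest, mimicking the registration of annotation data at block 230.

    frames: list of (timestamp, frame_id) pairs sorted by timestamp.
    annotations: list of (timestamp, payload) pairs.
    Returns a list of (frame_id, payload) pairs.
    """
    times = [t for t, _ in frames]
    out = []
    for t, payload in annotations:
        i = bisect.bisect_left(times, t)
        # Clamp to the valid range, then pick the nearer neighbour.
        if i == 0:
            j = 0
        elif i == len(times):
            j = len(times) - 1
        else:
            j = i if times[i] - t < t - times[i - 1] else i - 1
        out.append((frames[j][1], payload))
    return out
```

An audio note recorded at t = 0.9 s is thus attached to the scan frame captured at t = 1.0 s rather than the one at t = 0 s.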
(53) It should be appreciated that the first process 206 and the second process 208 may be performed multiple times during the scanning process and the single instance illustrated in
(54) Further, it should be appreciated that while embodiments herein describe the annotation data with respect to recording sound or marking locations with light, this is for exemplary purposes and the claims should not be so limited. In other embodiments, other types or sources of data may be integrated with the three-dimensional coordinate data to annotate the point cloud. Other annotation data may include data received via LAN/internet interface device 62, such as but not limited to measurement data, documentation data, hyperlinks, and web addresses for example.
(55) The term "about" is intended to include the degree of error associated with measurement of the particular quantity based upon the equipment available at the time of filing the application. For example, "about" can include a range of ±8%, ±5%, or ±2% of a given value.
(56) The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms "a," "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, element components, and/or groups thereof.
(57) While the disclosure is provided in detail in connection with only a limited number of embodiments, it should be readily understood that the disclosure is not limited to such disclosed embodiments. Rather, the disclosure can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the disclosure. Additionally, while various embodiments of the disclosure have been described, it is to be understood that the exemplary embodiment(s) may include only some of the described exemplary aspects. Accordingly, the disclosure is not to be seen as limited by the foregoing description, but is only limited by the scope of the appended claims.