System and Method for Surface Profiling
20200151848 · 2020-05-14
Assignee
Inventors
- Kent Kahle (Hayward, CA, US)
- Vinod Khare (Westminster, CO, US)
- Sascha KORL (Buchs, CH)
- Andreas Winter (Feldkirch, AT)
CPC classification
G06T3/08
PHYSICS
G06T7/521
PHYSICS
G06T17/20
PHYSICS
International classification
G06T17/20
PHYSICS
Abstract
A system and method for surface profiling via a projection system, such as a position enabled projector. By way of example, a three-dimensional representation of a physical object, such as an uneven surface of the object, may be generated and profiled. The three-dimensional representation may be a 3D point cloud, a surface mesh, or any other suitable type of representation. A two-dimensional image to be projected onto the surface may undergo an image transformation based on the generated 3D representation of the surface. The transformed image is then projected onto the surface, where the image points projected are at their true positions with true scale. Moreover, the projected image may be automatically updated when the projector is moved to a new position.
Claims
1.-21. (canceled)
22. A system for surface profiling, comprising: at least one processor for executing stored instructions to: capture geometric characteristics of an object, the geometric characteristics including a surface of the object, generate a three-dimensional (3D) point cloud of the object based on the captured geometric characteristics of the object, generate a 3D polygon mesh of the object based on the generated 3D point cloud, transform a first two-dimensional (2D) image to a second 2D image based on the generated 3D polygon mesh, and project the second 2D image onto the surface of the object.
23. The system of claim 22, further comprising one or more of: (i) at least one camera, (ii) at least one laser scanner, (iii) at least one surface profiling sensor, and (iv) at least one projector, wherein the at least one camera, the at least one scanner, the at least one surface profiling sensor, and the at least one projector are configured to profile the surface of the object.
24. The system of claim 23, wherein the at least one camera includes a time-of-flight camera.
25. The system of claim 23, further comprising one or more range meters configured to measure a distance between the one or more range meters and the object.
26. The system of claim 22, wherein the geometric characteristics of the object are captured based on at least one light pattern projected onto the surface and capturing the at least one projected light pattern using one or more cameras.
27. The system of claim 26, wherein the 3D point cloud of the object is generated based on the at least one captured light pattern.
28. The system of claim 22, wherein the 3D polygon mesh is generated based on Poisson surface reconstruction.
29. The system of claim 22, wherein the transformation is based on an affine transformation.
30. The system of claim 22, wherein the 3D polygon mesh includes a plurality of polygons arranged to virtually exhibit an overall shape of the object, wherein the plurality of polygons are triangles.
31. The system of claim 30, wherein the transformation of the first 2D image to the second 2D image further includes the at least one processor executing stored instructions to: locate a first triangle in an orthoimage of the first 2D image and a second triangle in the 3D polygon mesh corresponding to the first triangle, determine an affine transformation between the first and second triangles, and generate a third triangle based on the transformation.
32. The system of claim 22, wherein the surface of the object is an uneven surface.
33. The system of claim 32, wherein the projection of the second 2D image onto the uneven surface of the object is such that all characteristics of the first 2D image are positioned on the uneven surface with true location and true scale.
34. The system of claim 22, wherein the system for surface profiling is included in a position enabled projector.
35. The system of claim 34, wherein the transformation is further based on one or more of: (i) a position of the position enabled projector and (ii) an orientation of the position enabled projector.
36. The system of claim 23, wherein the at least one camera is mechanically secured relative to the at least one projector.
37. The system of claim 22, wherein the first 2D image is a blueprint of a construction-related task.
38. The system of claim 22, further comprising at least two cameras, at least one projector, and at least three range meters, wherein the at least three range meters are configured to measure a distance between each respective range meter and the object.
39. The system of claim 34, wherein the projection of the second 2D image onto the surface of the object is automatically updated, based on an updated transformation, when the position enabled projector is moved.
40. A method for surface profiling, comprising the steps of: capturing, by at least one processor, geometric characteristics of an object, the geometric characteristics including a surface of the object; generating, by the at least one processor, a three-dimensional (3D) point cloud of the object based on the captured geometric characteristics of the object; generating, by the at least one processor, a 3D polygon mesh of the object based on the generated 3D point cloud; transforming, by the at least one processor, a first two-dimensional (2D) image to a second 2D image based on the generated 3D polygon mesh; projecting, by the at least one processor, the second 2D image onto the surface of the object; and using the projected second 2D image in a construction task.
41. A non-transitory computer-readable medium comprising a set of executable instructions, the set of executable instructions when executed by at least one processor causes the at least one processor to perform a method for surface profiling, the method comprising the steps of: capturing geometric characteristics of an object, the geometric characteristics including a surface of the object; generating a three-dimensional (3D) point cloud of the object based on the captured geometric characteristics of the object; generating a 3D polygon mesh of the object based on the generated 3D point cloud; transforming a first two-dimensional (2D) image to a second 2D image based on the generated 3D polygon mesh; and projecting the second 2D image onto the surface of the object.
42. A computer program product comprising a set of executable instructions, the set of executable instructions when executed by at least one processor causes the at least one processor to perform the method for surface profiling according to claim 40.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION OF THE DRAWINGS
[0013] The present invention is directed to correctly and accurately projecting, using a projector system, a two-dimensional image (e.g., a construction-related blueprint) onto an uneven work surface, such as corrugated steel sheets, so that all points of the image appear on the surface at their true positions with true scale. Moreover, the present invention is directed to updating the projected image when the projector system is moved to a new position.
[0014] In one aspect of the present invention, a three-dimensional (3D) profile of the uneven surface may be generated. By way of example, generation of the 3D profile may be implemented by a projector system using one or more of the following components and/or approaches: (1) a laser scanner, (2) a time-of-flight (TOF) camera, (3) at least one stereoscopic camera-based approach, and/or (4) one or more structured light approaches. The 3D profile that is output from the projector system may be a point cloud, a surface mesh, a surface profile, or any other suitable type of three-dimensional representation of the surface. In a further example, the use of one or more range meters may improve the accuracy and robustness of the point cloud.
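A TOF camera, for example, returns a per-pixel depth image, which may be back-projected through a pinhole camera model into a 3D point cloud. The sketch below illustrates the idea only; the intrinsics (fx, fy, cx, cy) and function name are illustrative placeholders, not values or terminology from the disclosure:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into an Nx3 point cloud
    using pinhole intrinsics (focal lengths fx, fy; principal point
    cx, cy). A sketch, not the patented capture method."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx   # horizontal offset scaled by depth
    y = (v - cy) * z / fy   # vertical offset scaled by depth
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# A flat surface 2 m from the camera yields points all at z = 2.
cloud = depth_to_point_cloud(np.full((4, 4), 2.0), fx=500, fy=500, cx=2, cy=2)
```

A range meter reading could then be used as an additional constraint to validate or scale the depth values, consistent with the accuracy improvement noted above.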
[0015] In another aspect of the present invention, the 3D point cloud of the uneven surface generated by the projector system may be converted into a virtual surface mesh, such as a polygon mesh. The mesh may be generated by geometrically processing the point cloud so that the surface is virtually reconstructed.
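For a surface such as a corrugated sheet, the point cloud is effectively a height field, and a simple way to sketch the cloud-to-mesh conversion is to split each grid cell into two triangles. This stand-in is far simpler than the Poisson surface reconstruction the claims mention, and all names are illustrative:

```python
import numpy as np

def grid_mesh(heights):
    """Convert a regular grid of surface heights into a triangle mesh:
    vertices as an Nx3 array, faces as an Mx3 array of vertex indices.
    A minimal stand-in for a full surface-reconstruction algorithm."""
    h, w = heights.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    verts = np.column_stack([xs.ravel(), ys.ravel(), heights.ravel()])
    faces = []
    for r in range(h - 1):
        for c in range(w - 1):
            i = r * w + c
            faces.append([i, i + 1, i + w])          # upper-left triangle
            faces.append([i + 1, i + w + 1, i + w])  # lower-right triangle
    return verts, np.array(faces)

# Corrugation-like height samples on a 4x5 grid.
verts, faces = grid_mesh(0.05 * np.sin(np.linspace(0, 6, 20)).reshape(4, 5))
```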
[0016] In yet another aspect of the present invention, a two-dimensional (2D) image, such as a blueprint associated with a construction task, may be transformed (e.g., using linear affine transformation) based on the generated polygon mesh of the uneven surface. Optionally, the transformation may also be based on the position and the orientation of the projector system, which is further described in U.S. application Ser. No. 15/638,815, filed on Jun. 30, 2017, the content of which is incorporated herein by reference in its entirety. In at least that regard, the points and/or lines of the blueprint appear at their true positions, despite the uneven characteristics of the projection surface. As such, the construction worker relying on information in the projected blueprint to carry out the construction task may trust that the points, lines, and other graphical representations are where they actually have to be located.
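A 2D affine transform is fully determined by three point correspondences, so one way to sketch the per-triangle step is to solve a small linear system mapping a source triangle to its destination. The function and variable names below are illustrative, not taken from the disclosure:

```python
import numpy as np

def triangle_affine(src, dst):
    """Solve for the 2x3 affine matrix T such that T @ [x, y, 1]
    maps each vertex of the source triangle (3x2) to the matching
    vertex of the destination triangle (3x2)."""
    A = np.hstack([src, np.ones((3, 1))])   # rows of [x, y, 1]
    return np.linalg.solve(A, dst).T        # 2x3 affine matrix

def apply_affine(T, pts):
    """Apply the affine transform to an Nx2 array of points."""
    return np.hstack([pts, np.ones((len(pts), 1))]) @ T.T

src = np.array([[0., 0.], [1., 0.], [0., 1.]])
dst = np.array([[2., 1.], [4., 1.], [2., 3.]])  # scale by 2, then shift
T = triangle_affine(src, dst)
```

Applying `T` to any point inside the source triangle carries it to the corresponding point of the destination triangle, which is the elementary operation the mesh-based transformation repeats per triangle.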
[0017] One of the numerous advantages of the present invention is that the true position of every image point on the uneven surface is accounted for, which ensures that tasks such as the aforementioned construction task can be performed accurately and correctly. The invention relates to preserving accuracy (e.g., true position, true scale) of the various aspects of a projected image and not merely how the projected image may look to an observer. This may be achieved, for example, by calibrating all system components (each in itself and to one another) based on their known design, and by referencing all data (e.g., point cloud, range, mesh, original and transformed images) to a common coordinate system.
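Referencing all data to a common coordinate system typically amounts to applying a calibrated rigid transform (rotation plus translation) to each sensor's measurements. The sketch below assumes a made-up calibration; the rotation, offset, and function name are illustrative only:

```python
import numpy as np

def to_common_frame(points, R, t):
    """Express sensor-local 3D points (Nx3) in the common system frame,
    given the sensor's calibrated rotation R (3x3) and translation t."""
    return points @ R.T + t

# Hypothetical calibration: camera rotated 90 degrees about Z and
# offset 10 cm along X relative to the common frame.
R = np.array([[0., -1., 0.],
              [1.,  0., 0.],
              [0.,  0., 1.]])
t = np.array([0.1, 0., 0.])
common = to_common_frame(np.array([[1., 0., 0.]]), R, t)
```

Once point cloud, mesh, and image data share one frame, distances and positions computed between them are directly comparable, which is what makes the true-position guarantee checkable.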
[0018] The invention described herein may be implemented on and executed by one or more computing devices. For instance, the projector system may have computing capabilities, by way of example, one or more processors, central processing units (CPUs), etc. As will be further described below, the computing associated with surface profiling and projecting a transformed image according to aspect(s) of the present invention may be executed by computing hardware in the projector system itself. Alternatively, the processing may be performed by a separate portable computing device, such as a laptop, tablet computer, or any other suitable type of mobile computing device.
[0020] The instructions 116 may be one or more sets of computer-executable instructions (e.g., software) that can be implemented by the processor 112. Data 115 may include various types of information (which can be retrieved, manipulated and/or stored by the processor 112), such as information captured from surface profiling equipment to generate a 3D profile, mesh data, one or more images to be projected, one or more transformed images, etc.
[0021] Interface 137 may be any component that allows interfacing with an operator or user. For example, interface 137 may be a device, port, or a connection that allows a user to communicate with the projector system 110, including but not limited to a touch-sensitive screen, microphone, camera, and may also include one or more input/output ports, such as a universal serial bus (USB) drive, various card readers, etc. The interface 137 may also include hardware and/or equipment for surface profiling, such as one or more sensors (e.g., image sensors, light sensors), one or more cameras, one or more projectors, one or more range meters, etc.
[0022] The projector system 110 may be configured to communicate with other computing devices via network 130. For example, the projector system 110 may communicate with other projector systems, and/or mobile computing devices (e.g., laptops, tablet computers, smartphones). The network 130 may be any type of network, such as LAN, WAN, Wi-Fi, Bluetooth, etc.
[0023] Although processing related to at least generating the 3D profile, surface profiling, and/or image transformation is carried out by the one or more processors 112 of the projector system 110, it may be understood that the processing may be performed by external computing devices and/or hardware, such as a mobile computing device, that may be communicating with the projector system 110 via the network 130.
[0025] Before image transformation, however, the projector system 110 may take the image 206 to be projected and input it into block 210. Optionally, information on the position and/or the orientation of the projector system 110 may also be input into block 210 for further accuracy. The image 206 may be a two-dimensional image. At block 210, the image 206 to be projected may be mapped onto the polygon mesh and a new two-dimensional image (e.g., a transformed image) is generated for projection at block 214.
[0027] In an example,
[0028] In another example,
[0029] By way of example,
[0030] According to aspects of the present invention, a 3D point cloud of an object may be generated using different techniques. For example, a technique based on structured light where a projector is used to project one or multiple light patterns onto the surface may be implemented. These light patterns may be captured by one or more cameras, such as the cameras 304 and 404 of
[0031] After generating the 3D point cloud, the 3D point cloud may be converted into a polygon mesh, as described above.
[0032] More specifically, during image transformation, a triangle may be located in a plan image 804 (e.g., ortho) of the image 802 and a corresponding triangle may be located in the 3D polygon mesh, as illustrated in
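One way to realize the correspondence between a plan-image triangle and its mesh counterpart is through barycentric coordinates: a point's barycentric weights in the source triangle, re-applied to the destination triangle, give the same result as the per-triangle affine transform. A sketch under illustrative names:

```python
import numpy as np

def barycentric(p, tri):
    """Barycentric coordinates of 2D point p with respect to
    triangle tri (3x2 array of vertices)."""
    a, b, c = tri
    M = np.column_stack([b - a, c - a])
    u, v = np.linalg.solve(M, p - a)
    return np.array([1 - u - v, u, v])

def warp_point(p, src_tri, dst_tri):
    """Map p from the plan-image triangle to the corresponding
    mesh triangle by re-applying its barycentric weights."""
    return barycentric(p, src_tri) @ dst_tri

src = np.array([[0., 0.], [2., 0.], [0., 2.]])
dst = np.array([[1., 1.], [3., 1.], [1., 4.]])
q = warp_point(np.array([1., 1.]), src, dst)
```

Repeating this for every pixel of every triangle produces the transformed (third) triangle of the preceding paragraph.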
[0035] In image projection 910, for example, if a blueprint image were to be projected onto the corrugated steel sheet without proper image transformation, the point (which indicates where the construction worker needs to drill) would be projected slightly below where the construction worker actually needs to drill. In image projection 940, however, the blueprint image undergoes proper image transformation (such as the image transformation described above) to account for the uneven surface of the corrugated steel sheet, so the point that indicates where the construction worker needs to drill is projected at its true position. In other words, as shown in
[0036] In embodiments according to aspects of the present invention, when the projector is moved from one position to a new position, the image transformation may be automatically updated based on newly acquired information on the geometric characteristics of the object at the new position. Then, the projector system may automatically update the projection of the updated-transformed image from the new position. In at least that regard, when the construction worker intentionally or accidentally moves the projector system, or if the projector system is moved for other reasons, the projected image is constantly and/or automatically updated so that tasks associated with the image being projected may be performed with little to no interruption.
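The update behavior can be sketched as a small state machine: a pose change triggers re-transformation before the next projection. The class below is a toy model (the "transform" is just a shift by the pose; real recomputation would re-run the mesh-based warp), and all names are illustrative:

```python
class AutoUpdatingProjector:
    """Sketch of the automatic update: when the pose changes, the image
    transformation is recomputed before projecting. The transform here
    is a stand-in (shifting each image point by the pose offset)."""

    def __init__(self, image):
        self.image = image        # list of (x, y) image points
        self.pose = None
        self.transformed = None

    def project(self, pose):
        if pose != self.pose:     # projector was moved: recompute
            self.pose = pose
            self.transformed = [(x + pose[0], y + pose[1])
                                for x, y in self.image]
        return self.transformed

proj = AutoUpdatingProjector([(0, 0), (1, 2)])
first = proj.project((5, 5))   # initial projection
moved = proj.project((6, 5))   # pose changed: transform is redone
```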
[0037]
[0038] In step 1010, the projector system 110, for instance, may capture the overall geometric characteristics of an object, including the surface of the object (whether the surface is even or uneven). As described above, the object may be a corrugated steel sheet and the geometric characteristics may be captured using a laser scanner, a range finding camera (e.g., a time-of-flight camera), stereoscopy (e.g., using two cameras) based approach, structured light approach, etc.
[0039] In step 1020, a 3D point cloud of the object may be generated using the obtained overall geometric characteristics in step 1010. Thereafter, in step 1030, the 3D point cloud may be used to generate a 3D polygon mesh of the object, which may be used to transform a 2D image to another 2D image. The 2D image being transformed may be a blueprint for performing a construction-related work task, such as drilling holes.
[0040] If the surface is uneven, the 3D polygon mesh generated in step 1030 will be used to transform the 2D image into a new 2D image in step 1040, so that when the new 2D image is projected onto the surface of the object in step 1050, the image characteristics and corresponding information (e.g., the exact drilling positions) will be projected on the surface at their correct and accurate locations with true scale.
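The effect of the pipeline can be illustrated with a one-dimensional toy: a projector above a corrugated surface must aim at a slightly different angle than a flat-surface assumption would predict in order for the mark to land at its true plan position. The surface profile, heights, and names below are made up for illustration:

```python
import math

def aim_angle(x_true, z_of_x, projector_height):
    """Angle (from vertical) at which a projector directly above the
    origin must emit a ray to land at plan position x_true on a surface
    with height profile z_of_x. Comparing against a flat-surface
    assumption shows the error the image transformation corrects."""
    return math.atan2(x_true, projector_height - z_of_x(x_true))

corrugation = lambda x: 0.05 * math.sin(10 * x)   # toy corrugated profile

corrected = aim_angle(0.5, corrugation, 2.5)      # accounts for height
naive = aim_angle(0.5, lambda x: 0.0, 2.5)        # flat-surface guess
```

The nonzero difference between `corrected` and `naive` is, in this simplified picture, exactly the drilling-position error described for image projection 910 above.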
[0041] Numerous advantages of the present invention include, but are not limited to, accounting for the accurate and correct projection of every point, line, characteristic, etc., of an image on an uneven surface, especially if the image being projected is related to a task that requires accuracy and precision. In that regard, how the projected image looks to an observer is not the main concern of the present invention, but rather whether one or more points in an image are projected at their true positions.
[0042] The foregoing invention has been set forth merely to illustrate the invention and is not intended to be limiting. Since modifications of the disclosed embodiments incorporating the spirit and substance of the invention may occur to persons skilled in the art, the invention should be construed to include everything within the scope of the appended claims and equivalents thereof. Although the present disclosure uses terminology and acronyms that may not be familiar to the layperson, those skilled in the art will be familiar with the terminology and acronyms used herein.