Abstract
An apparatus for modelling a shaft in 3D includes a sensor head adapted to be axially moved within the shaft. The sensor head includes: image sensors placed along the circumference of the sensor head, adapted to take images along an inner circumference of the shaft; and a measuring apparatus adapted to determine a measured height position of the sensor head within the shaft. A processing unit includes: a placement module configured to place the images in a virtual space, based on the measured height position and the positioning of the image sensors on the sensor head; and a correction module configured to correct the placement, based on comparing overlapping images and/or based on a measured deviation of the sensor head with respect to a central axis of the shaft.
Claims
1.-14. (canceled)
15. An apparatus for modelling a shaft in 3D, comprising: a sensor head adapted to be axially moved within said shaft by means of a suspension system, said sensor head comprising: 3D sensors placed along the circumference of said sensor head, adapted to take images along an inner circumference of said shaft, each of said images comprising depth information about the part of the inner surface of said shaft within the field of view of the respective 3D sensor; a measuring apparatus adapted to determine a measured height position of said sensor head within said shaft; a processing unit comprising: a placement module configured to place said images in a virtual space, based on said measured height position and the positioning of said 3D sensors on said sensor head, resulting in a rough placement of said images, thereby obtaining a first reconstruction of the inner surface of said shaft; a correction module configured to correct said rough placement, based on comparing overlapping images and/or based on a measured deviation of said sensor head with respect to a central axis of said shaft, resulting in a 3D model of said shaft.
16. The apparatus according to claim 15, wherein said apparatus comprises a second measuring apparatus adapted to determine said measured deviation by reconstruction of the path followed by said sensor head.
17. The apparatus according to claim 16, wherein said placement module is configured to place said images in said virtual space based on said path, and wherein said correction module is configured to correct said rough placement based on comparing overlapping images.
18. The apparatus according to claim 17, wherein said second measuring apparatus is an inertia measuring unit, and/or comprises an accelerometer, and/or comprises a gyroscope, and/or comprises a magnetometer.
19. The apparatus according to claim 15, wherein said correction module is configured to correct the placement of said images on a circumferential position in said virtual space, based on comparing images overlapping in said rough placement in the height direction.
20. The apparatus according to claim 19, wherein said correction module is configured to correct said rough placement by minimizing the difference between point clouds.
21. The apparatus according to claim 15, wherein said 3D sensors use 3D imaging technology.
22. The apparatus according to claim 21, wherein said 3D sensors use a combination of stereovision technology and the projection of a structured light pattern.
23. The apparatus according to claim 15, wherein said 3D sensors are placed on said sensor head according to a same height position on said sensor head.
24. The apparatus according to claim 15, wherein said measuring apparatus is adapted to determine said measured height position based on 3D imaging technology.
25. The apparatus according to claim 15, wherein said processing unit forms a physical unit with said sensor head.
26. The apparatus according to claim 15, wherein said processing unit is adapted to determine dimensional parameters based on said placement of said images in said virtual space, said dimensional parameters being derived from said 3D model of said shaft.
27. A system for modelling a shaft in 3D, comprising: an apparatus according to claim 15; a suspension system adapted to move said sensor head axially within said shaft, comprising a mobile component chosen from the group of: a bar, a telescopic arm, or one or multiple cables, wherein said mobile component is adapted to be manually held in position or to be connected to a movable positioning system during said axial movement of said sensor head.
28. A method for modelling a shaft in 3D, comprising: moving a sensor head axially within said shaft by means of a suspension system; taking images by means of 3D sensors, wherein said 3D sensors are placed along the circumference of said sensor head and are adapted to take said images along an inner circumference of said shaft, and wherein each of said images comprises depth information about the part of the inner surface of said shaft within the field of view of the respective 3D sensor; determining a measured height position of said sensor head within said shaft by means of a measuring apparatus comprised within said sensor head; placing said images in a virtual space based on said measured height position and the positioning of said 3D sensors on said sensor head, resulting in a rough placement of said images, thereby obtaining a first reconstruction of the inner surface of said shaft; correcting said rough placement based on comparing overlapping images and/or based on a measured deviation of said sensor head with respect to a central axis of said shaft, resulting in a 3D model of said shaft.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0049] FIG. 1 shows an apparatus for modelling a shaft in 3D, according to an embodiment of the invention.
[0050] FIG. 2, including FIGS. 2a and 2b, and FIG. 3, including FIGS. 3a and 3b, illustrate a system for modelling a shaft in 3D, according to different embodiments of the invention, more specifically where various possible suspension systems are used.
[0051] FIG. 4, including FIGS. 4a and 4b, illustrates in a schematic manner a sensor head according to the invention, and the use of this sensor head in measuring a shaft. FIG. 4a and FIG. 4b give an axial cross section and a transverse cross section, respectively.
[0052] FIG. 5 gives a block diagram illustrating the data flows from a sensor head to a processing unit, and of the various modules in a processing unit, according to an embodiment of the invention.
[0053] FIG. 6, including FIGS. 6a to 6c, illustrates the descent of a sensor head into a shaft, and the possible occurrence of rotations and swaying during this descent.
[0054] FIG. 7, including FIGS. 7a to 7c, illustrates how a processing unit according to an embodiment of the invention takes into account rotations of the sensor head about its own axis and swaying of the sensor head while measuring the shaft.
[0055] FIG. 8 gives a block diagram illustrating the processing of rough data to dimensional parameters and various other possibilities in a user interface, according to an embodiment of the invention.
DETAILED DESCRIPTION OF THE EMBODIMENTS
[0056] FIG. 1 shows an apparatus for modelling a shaft 105 in 3D, according to an embodiment of the invention. In the application shown in FIG. 1, the shaft 105 is a manhole of a sewage system, or sewer shaft, extending from the ground level 106 to an underground sewage pipeline. The height of the shaft 105 is measured according to an axial direction 108. In the embodiment of FIG. 1, the shaft 105 has a cylindrical wall 107, having a cross section which is axially symmetric and which is constant over the height of the shaft 105. Typically, the vertical shaft 105 connects to a horizontal pipeline, where the wall 107 is thus interrupted. Other shapes are however possible, such as a square or rectangular cross section of the shaft, or a cross section varying over the height of the shaft. Other applications are possible as well, where the shaft 105 is for example a chimney, pipeline, tank, pipeline shaft, elevator shaft, etc.
[0057] The apparatus comprises a sensor head 101 and a processing unit 500. The processing unit 500, not visible in FIG. 1, is located within the sensor head 101, or within a separate device, or is divided over the sensor head 101 and another device. The sensor head 101 is adapted to be moved within the shaft 105 by means of a suspension system 100, according to the axial direction 108. In the embodiment of FIG. 1, the suspension system 100 comprises a stand 103 and a cable or bar 102. FIG. 1 shows the setup during measurement of the shaft 105, where the stand 103 is placed on the ground level 106 and the sensor head 101 is lowered into the shaft 105 by means of the bar or cable 102. After the measurement is finished, the sensor head 101 is detached and the stand 103 is folded up, so that the pieces may easily be transported to another location. FIG. 1 shows that both the suspension system 100 and the sensor head 101 are compact and light, rendering the assembly easily movable and portable. In the embodiment of FIG. 1, the sensor head 101 has a weight of around 5 kg.
[0058] In the embodiment of FIG. 1, the system also comprises technology for geo-positioning and wireless communication. Geo-positioning, for example GPS localization, allows the geographical location of the measurement to be registered automatically. Wireless communication, for example WiFi communication, radio communication or Bluetooth, allows measured data to be forwarded wirelessly. In the embodiment of FIG. 1, the technology for geo-positioning and wireless communication is accommodated in a separate housing 104, which is mounted on the stand. However, other embodiments are possible, where this technology is for example accommodated in or on the sensor head 101.
[0059] FIG. 2 and FIG. 3 illustrate how the sensor head 101 may be combined with various suspension systems. In FIG. 2a, the suspension system comprises a stand 201 and a telescopic arm or extendable bar 200. In FIG. 2b, the sensor head 101 is suspended at three suspension points from a system 203 mounted in a vehicle. The figure shows three cables 202, but a different number of cables and/or suspension points is also possible. The suspension systems shown in FIG. 2 contribute to limiting the movement of the sensor head 101 during the descent into the shaft 105. However, FIG. 3 illustrates that the sensor head 101 may also be combined with a freer suspension, where swaying and rotations of the sensor head 101 about its own axis may occur. In FIG. 3a, the suspension system comprises a stand 301, where the sensor head 101 is suspended by a cable 300, for example a steel cable. In FIG. 3b, the sensor head 101 is attached to a cable 302 at one end, and the cable 302 is held by a person, see 303, in order to lower the sensor head 101 manually into the shaft 105. Such a free choice of suspension allows, for example, a standard version to be marketed, but also solutions tailored to a user or operator to be realized. For example, if a customer already has at its disposal a suspension system for fall prevention when a person descends into a manhole, then this suspension system may easily be reused in combination with the sensor head 101. Also note that possible embodiments are not limited to those presented in FIG. 2 and FIG. 3. For example, a cable system 203 in a vehicle may be combined with a single cable 300, or the cable 300 may be replaced by a bar.
[0060] FIG. 4 illustrates schematically the use of the sensor head 101 in measuring a shaft 105. The transverse cross section of FIG. 4b shows that the sensor head 101 comprises ten image sensors 400 or 3D sensors 400, placed equidistantly about the circumference of the sensor head 101. The axial cross section of FIG. 4a shows that the image sensors 400 are placed at the same height position on the sensor head 101. The image sensors 400 in FIG. 4 are presented schematically, without intending to represent a realistic shape or realistic dimensions of the sensors. Other embodiments are also possible, for example where a different number of image sensors 400 is present, or where their height position on the sensor head 101 differs between sensors.
[0061] FIG. 4 illustrates that each image sensor 400 has a certain viewing angle, according to the height position and according to the circumferential direction. FIG. 4a shows that, under a viewing angle 401, an image sensor 400 has a view of a part of the inner surface of the shaft 105 over a height 402. Analogously, FIG. 4b shows that, under a viewing angle 403, an image sensor 400 has a view of a part of the inner circumference of the shaft 105 over a tangential distance 404. Together, the image sensors 400 have a 360° view of a complete inner circumference of the shaft 105, where individual images partly overlap in the circumferential direction, as may be seen in FIG. 4b. The presence of ten sensors 400 over the circumference of the sensor head 101 allows sensors 400 with a limited viewing angle 403 to be used, typically resulting in a higher resolution of the images.
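The following sketch (not taken from the patent; the numbers are purely illustrative) works out the relation between the number of sensors and the circumferential overlap: with ten sensors placed equidistantly, adjacent optical axes are 36° apart, so each sensor needs a viewing angle 403 of more than 36° for neighbouring images to overlap.

```python
# Illustrative sketch only: circumferential overlap for N equidistant sensors.
N_SENSORS = 10                           # sensors around the circumference (FIG. 4b)
SPACING_DEG = 360.0 / N_SENSORS          # 36 degrees between adjacent optical axes

def overlap_fraction(fov_deg: float, spacing_deg: float = SPACING_DEG) -> float:
    """Fraction of each image that overlaps its neighbour, assuming the sensors
    look radially outward and the sensor head is roughly centred in the shaft."""
    return max(0.0, (fov_deg - spacing_deg) / fov_deg)

for fov in (36.0, 45.0, 65.0):
    print(f"viewing angle {fov:4.1f} deg -> overlap {overlap_fraction(fov):.0%} per image pair")
# viewing angle 36.0 deg -> overlap 0%   (just enough to cover the full 360 degrees)
# viewing angle 45.0 deg -> overlap 20%
# viewing angle 65.0 deg -> overlap 45%
```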
[0062] Images are taken by the image sensors 400 during a descent of the sensor head 101 into the shaft 105. In this way, a ring of images becomes available at different height positions of the sensor head 101 within the shaft 105. No rotation of the sensor head 101 is needed for this, so that moving parts are avoided. Typically, the time point at which the respective image is taken is also registered by the image sensor 400.
[0063] FIG. 4a shows that typically the ring of images taken at a certain height position of the sensor head 101 overlaps with the ring of images taken at the next height position. The axial movement of the sensor head 101 may be continuous, where images are taken at a certain time interval, or the movement of the sensor head 101 may be discontinuous, where the axial movement is stopped intermittently. In the latter case, the image sensors 400 may be controlled sequentially, wherein at a given height position of the sensor head the image sensors 400 each in turn take their image and forward it. Such control may offer a solution in cases where a large data flow is not possible.
[0064] In an embodiment of the invention, the image sensors 400 are 3D sensors using a combination of stereovision technology and the projection of a structured light pattern. A 3D sensor 400 herein comprises two cameras, which take images from different angles, and a light source such as a laser, allowing a pattern to be projected. The 3D sensor also comprises a processor allowing the 3D image to be reconstructed through image processing algorithms. A 3D sensor 400 is for example an Intel RealSense Depth Camera D415, or a similar technology. A 3D sensor 400 may also comprise an RGB sensor, to collect color information. The combination of stereovision and structured light allows an accurate, high-quality model to be obtained under all lighting conditions. However, other embodiments are also possible, where a different type of 3D sensor is used. Optionally, the sensor head 101 may also contain one or more light sources, to illuminate the wall of the shaft 105 while taking images.
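As an aside, the depth information in each image can be thought of as a small point cloud per sensor. The sketch below (an assumption, not the patented implementation) shows a generic conversion of a depth image into points in the sensor frame using a standard pinhole camera model; the intrinsics fx, fy, cx, cy are placeholder values, whereas a real sensor such as the RealSense D415 reports its own calibrated intrinsics.

```python
# Sketch under assumptions: depth image -> point cloud in the sensor frame.
import numpy as np

def depth_to_points(depth_m: np.ndarray, fx: float, fy: float,
                    cx: float, cy: float) -> np.ndarray:
    """Return an (N, 3) array of XYZ points; pixels with zero depth are dropped."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]

# Illustrative use with a synthetic 480x640 depth image of a wall 0.5 m away.
depth = np.full((480, 640), 0.5)
cloud = depth_to_points(depth, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
print(cloud.shape)   # (307200, 3)
```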
[0065] The sensor head 101 furthermore comprises a measuring apparatus 501 adapted to determine a measured height position 507 of the sensor head 101 within the shaft 105, as presented schematically in FIG. 5. In the embodiment of FIG. 1, the measuring apparatus 501 is mounted at the bottom of the sensor head 101, so that it is directed towards the bottom of the shaft 105 while measuring. The measuring apparatus 501 is for example a one-dimensional laser, or a 3D image sensor having a processor to derive the measured height position from the image of the bottom. In another embodiment, the measuring apparatus 501 may be mounted at the top of the sensor head 101, being directed towards the top of the shaft 105 while measuring. In FIG. 6, different measured height positions 600, 601, 602 are depicted schematically. Typically, the measuring apparatus 501 also registers the time at which the respective height position 600, 601, 602 is measured. Optionally, in addition to the measuring apparatus 501, a detector may also be present on the sensor head 101, which stops the descent when the sensor head 101 has almost reached the bottom. If necessary, the recording of images and measurements may then be stopped automatically.
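Since both the images and the height readings are timestamped, one simple bookkeeping step (sketched below as an assumption, not as the patented method) is to interpolate the measured height positions at the time points at which the images were taken.

```python
# Sketch under assumptions: assign a measured height to each ring of images by
# interpolating the timestamped height readings of the measuring apparatus 501.
import numpy as np

t_height = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # timestamps of height readings (s)
height   = np.array([0.0, 0.6, 1.3, 1.9, 2.5])   # measured heights below ground level (m)
t_images = np.array([0.5, 1.5, 2.5, 3.5])        # timestamps at which the sensors 400 fired

height_at_image = np.interp(t_images, t_height, height)
print(height_at_image)   # [0.3  0.95 1.6  2.2 ]
```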
[0066] In the embodiment of FIG. 5, the sensor head 101 also comprises a second measuring apparatus 502, adapted to determine a measured deviation 508 of the sensor head 101 with respect to a central axis 603 of the shaft 105. The second measuring apparatus 502 is for example an inertial measurement unit, IMU, allowing rotations and accelerations to be measured. The IMU allows, for example, the path followed by the sensor head 101 to be reconstructed, and the measured deviation 508 may be derived from this. In an embodiment, the measuring apparatus 501 and the second measuring apparatus 502 may be combined in a single device, the latter device being adapted to track the position of the sensor head 101 in the x, y and z directions.
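A heavily simplified sketch of how a lateral deviation could be derived from IMU data follows; it only double-integrates lateral accelerations and is an assumption for illustration, whereas a practical implementation would also fuse gyroscope and magnetometer data and compensate for gravity and drift.

```python
# Sketch under assumptions: lateral deviation 508 by double integration of
# IMU accelerations (no gravity compensation, no drift correction).
import numpy as np

def lateral_path(acc_xy: np.ndarray, dt: float) -> np.ndarray:
    """acc_xy: (N, 2) lateral accelerations in m/s^2, sampled every dt seconds.
    Returns the (N, 2) estimated lateral displacement of the sensor head."""
    vel = np.cumsum(acc_xy, axis=0) * dt    # acceleration -> velocity
    pos = np.cumsum(vel, axis=0) * dt       # velocity -> position
    return pos

# Illustrative use: a short sway in the x direction during the descent.
dt = 0.01
acc = np.zeros((300, 2))
acc[:100, 0] = 0.2        # accelerate sideways for 1 s
acc[100:200, 0] = -0.2    # decelerate for 1 s
deviation_xy = lateral_path(acc, dt)
print(deviation_xy[-1])   # lateral deviation at the end of the interval
```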
[0067] FIG. 5 shows that the images 506 taken by the 3D sensors 400, the measured height position 507 resulting from the measuring apparatus 501, and the measured deviation 508 resulting from the second measuring apparatus 502 are transferred to a processing unit 500. The times corresponding to the respective measurements are transferred to the processing unit 500 as well. An API is for example used to read the information coming from the image sensors 400 and the measuring apparatus. The processing unit 500 also has at its disposal information regarding the circumferential position and height position of the image sensors 400 on the sensor head 101. The exact position of the image sensors 400 on the sensor head 101 is for example determined by means of a calibration, prior to measuring the shaft 105. An automated calibration tool may for example be used, where the sensor head 101 takes images in a test environment provided with reference points, and the exact position of each sensor 400 on the sensor head 101 is determined from this.
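One possible form of such a calibration (sketched below as an assumption, not as the patented procedure) is to estimate each sensor's rigid pose on the sensor head from reference points whose positions are known in the head frame and which are also observed in the sensor's own frame, using a least-squares rigid alignment (Kabsch algorithm).

```python
# Sketch under assumptions: recover a sensor's pose from reference points.
import numpy as np

def rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares rotation r and translation t with dst ~= src @ r.T + t."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    u, _, vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(vt.T @ u.T))
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = dst.mean(0) - src.mean(0) @ r.T
    return r, t

# Illustrative check: reference points seen in the sensor frame are related to
# the head frame by a 30 degree rotation and a small offset; the fit recovers it.
ref_head = np.random.rand(6, 3)
ang = np.deg2rad(30.0)
r_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0, 0.0, 1.0]])
ref_sensor = (ref_head - np.array([0.05, 0.0, 0.1])) @ r_true
r_est, t_est = rigid_transform(ref_sensor, ref_head)
print(np.allclose(ref_sensor @ r_est.T + t_est, ref_head))   # True
```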
[0068] The processing unit 500 in FIG. 5 comprises a placement module 503, a correction module 504 and a visualization module 505. For example, a first placement of the images in the virtual space is performed by the placement module 503 by means of the images 506, the measured height position 507 and the information coming from the IMU 502, resulting in a rough placement 509. Here, images taken at a certain time are placed at the measured height position corresponding to the same time, and the path measured by the IMU is used to take swaying and rotation of the sensor head 101 about its own axis into account. The first placement is thus based on tracking the sensor head 101. Next, a correction is performed on the rough placement 509 by means of the correction module 504. An Iterative Closest Point, ICP, algorithm is for example used to compare overlapping parts of the images and thereby refine the placement or correct errors or inaccuracies. This results in a corrected placement 510 of the images in the virtual space. The 3D model 510 of the shaft, corresponding to the corrected placement 510, is then visualized for a user in the visualization module 505. By using measured RGB information, color values may be added to the 3D model 510 as well.
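A minimal sketch of these two steps follows, under assumptions: synthetic point clouds stand in for the measured images, and the Open3D library is used as a stand-in for the ICP step described above.

```python
# Sketch under assumptions: rough placement of a ring at its measured height,
# followed by an ICP refinement against an overlapping, already placed ring.
import numpy as np
import open3d as o3d

def place_ring(points_sensor: np.ndarray, height_m: float, yaw_rad: float) -> np.ndarray:
    """Rough placement 509: rotate a ring of points by the sensor-head yaw
    (from the IMU) and shift it down to the measured height position 507."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return points_sensor @ rz.T + np.array([0.0, 0.0, -height_m])

def refine_with_icp(source_pts: np.ndarray, target_pts: np.ndarray) -> np.ndarray:
    """Correction towards the corrected placement 510: align the overlapping
    parts of two placed rings; returns a 4x4 rigid transformation."""
    src, tgt = o3d.geometry.PointCloud(), o3d.geometry.PointCloud()
    src.points = o3d.utility.Vector3dVector(source_pts)
    tgt.points = o3d.utility.Vector3dVector(target_pts)
    result = o3d.pipelines.registration.registration_icp(
        src, tgt,
        0.05,            # maximum correspondence distance (m)
        np.eye(4),       # initial guess: the rough placement itself
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation
```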
[0069] In another embodiment, the placement module 503 uses the measured images 506 and the measured height position 507 to perform a rough placement 509 of the images in the virtual space, taking into account the position of the image sensors 400 on the sensor head 101. The rough placement 509 is then corrected by the correction module 504 in two ways. On the one hand, the placement 509 is corrected for swaying of the sensor head, by means of the measured deviation 508, derived from the path measured by the IMU. On the other hand, the rough placement 509 is corrected for rotations of the sensor head about its own axis, by means of an ICP algorithm. The ICP algorithm here compares parts of images that overlap over the height, and searches for similar features in the overlapping parts with regard to color and/or depth information. The circumferential position of the images is then corrected so that corresponding features coincide as closely as possible. Both corrections, of which the order may differ, result in a corrected placement 510 of the images in the virtual space, which may be visualized in a visualization module 505.
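The circumferential correction can also be pictured without a full ICP: the sketch below (an assumption, following the idea of claim 20 of minimizing the difference between point clouds) searches over candidate rotations about the shaft axis and keeps the one that makes the overlapping parts of two rings coincide best. In practice an ICP algorithm estimates the full rigid transformation rather than performing this one-dimensional search.

```python
# Sketch under assumptions: correct the circumferential position of a ring by
# minimizing the difference between overlapping point clouds.
import numpy as np

def rotate_z(points: np.ndarray, angle_rad: float) -> np.ndarray:
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return points @ rz.T

def cloud_difference(a: np.ndarray, b: np.ndarray) -> float:
    """Mean distance from every point in a to its nearest point in b."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return float(d.min(axis=1).mean())

def best_yaw_correction(overlap_new: np.ndarray, overlap_ref: np.ndarray,
                        candidates_deg=np.arange(-10.0, 10.5, 0.5)) -> float:
    """Yaw correction (degrees) that best aligns the new ring's overlapping
    part with the overlapping part of an already placed ring."""
    errors = [cloud_difference(rotate_z(overlap_new, np.deg2rad(a)), overlap_ref)
              for a in candidates_deg]
    return float(candidates_deg[int(np.argmin(errors))])
```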
[0070] Other embodiments are possible, e.g. wherein no IMU 502 is present and corrections are purely based on an ICP algorithm, or wherein an IMU 502 is present in the sensor head 101, while no ICP-based corrections are done.
[0071] FIGS. 6 and 7 further illustrate a possible operation of the placement module 503 and correction module 504, and how these take into account swaying of the sensor head 101 and rotations about its own axis. FIG. 6a illustrates a descent of the sensor head 101, with successive height positions 600, 601 and 602, where no swaying or rotations occur. In FIG. 6b a rotation of the sensor head 101 about its own axis 603 occurs, but no swaying. Swaying means that a deviation 606 occurs with respect to a vertical axis 603. FIG. 6c illustrates how the processing unit 500 would assume a measured height position 605 if the swaying that occurs were not taken into account, while the image taken by the respective sensor 400 is actually located at a height position 604. This would result in an inaccurate placement of the images in the virtual space.
[0072] FIG. 7 illustrates how the placement of the images may be corrected for rotations and swaying. In the situation of FIG. 7a, no swaying or rotation of the sensor head 101 occurs during the descent. A ring of images is measured at three different height positions, see the rings 700, 701 and 702 on the left in FIG. 7a. FIG. 7a illustrates the placement of the images in the virtual space, where for simplicity of representation the circumferential direction is projected onto the horizontal direction. In FIG. 7a it suffices to place the measured rings 700, 701 and 702 in the virtual space taking into account only the measured height positions.
[0073] FIG. 7b illustrates a situation where swaying of the sensor head 101 occurs during the descent of the sensor head 101. Among the measured rings of images 703, 704, 705, the ring 704 shows a tilt as a result of swaying of the sensor head 101 at the place where the ring 704 was measured. If, in this situation, the rings 703, 704, 705 are placed in the virtual space taking into account only the measured height position, this results in an inaccurate placement. FIG. 7b, on the right, illustrates how, to obtain an accurate placement, the image should be tilted in the virtual space as well. This may be done by taking into account a measured deviation 606 of the sensor head, which may for example be derived from the path measured by an IMU.
[0074] FIG. 7c illustrates a situation where a rotation of the sensor head 101 about its own axis and swaying of the sensor head 101 occur during the descent. The ring of images 706 was taken at a height position where the sensor head 101 was twisted. The ring of images 707 was taken at a height position where the sensor head 101 was tilted. The ring of images 708 was taken at a height position where the sensor head 101 was neither twisted nor tilted. Analogously to FIG. 7b, FIG. 7c illustrates, on the right, how the image 707 should be placed in the virtual space in a tilted fashion in order to take into account the swaying of the sensor head 101. On the other hand, the image 706 should be rotated in the virtual space with respect to the other images 707, 708 in order to take into account the rotation of the sensor head 101. In the projected representation of FIG. 7c, on the right, this may be seen as a shift of the image 706 with respect to the other images 707, 708. The rotated placement of the image 706 in the virtual space may for example be based on rotation information measured by an IMU, or on an ICP algorithm which compares images partially overlapping in the height direction.
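The tilted and rotated placement of FIG. 7 amounts to applying a rotation matrix, built from the IMU's roll, pitch and yaw, to each ring before shifting it to its measured height. The sketch below is an assumption with illustrative values only.

```python
# Sketch under assumptions: place a ring of points while compensating for tilt
# (swaying) and rotation about the sensor head's own axis.
import numpy as np

def rotation_from_rpy(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Rotation matrix from roll, pitch, yaw (radians), z-y-x convention."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return rz @ ry @ rx

def place_tilted_ring(ring_pts: np.ndarray, height_m: float,
                      roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Tilt and rotate a ring (head frame) and shift it to its measured height."""
    r = rotation_from_rpy(roll, pitch, yaw)
    return ring_pts @ r.T + np.array([0.0, 0.0, -height_m])

# Analogue of ring 707 in FIG. 7c: tilted by 5 degrees of swaying, no rotation.
theta = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
ring = np.stack([0.5 * np.cos(theta), 0.5 * np.sin(theta), np.zeros_like(theta)], axis=1)
placed_707 = place_tilted_ring(ring, height_m=1.5, roll=np.deg2rad(5.0), pitch=0.0, yaw=0.0)
```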
[0075] FIG. 8 illustrates further processing possibilities once the 3D model of the shaft 105 has been determined in the virtual space. In FIG. 8, the starting point is the rough data 800, which is available in the form of a point cloud, if necessary supplemented with color information. Optionally, the 3D images are filtered, to reduce the noise present. A parametrization 801 is then performed based on the rough data 800, allowing dimensional parameters to be derived from the 3D model. A user interface allows, for example, cuts to be made in the model, for which distances and/or areas are calculated automatically. Certain shapes may also be recognized automatically, such as the presence of a pipeline connecting to the shaft. Optionally, image recognition algorithms may be used to derive information regarding the structural condition of the shaft based on colors present in the model. Certain defects, such as a missing stone, a leak, water seepage, etc., may be recognized automatically in the 3D model.
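One concrete dimensional parameter is the internal diameter at a given height; the sketch below (an assumption, not the patented parametrization) estimates it with a least-squares circle fit to a horizontal slice of the point cloud.

```python
# Sketch under assumptions: internal diameter from a horizontal slice of the
# point cloud, via an algebraic (Kasa) least-squares circle fit.
import numpy as np

def fit_circle(xy: np.ndarray):
    """Least-squares circle through the points xy (N, 2): returns (cx, cy, r)."""
    a = np.column_stack([xy[:, 0], xy[:, 1], np.ones(len(xy))])
    b = -(xy[:, 0] ** 2 + xy[:, 1] ** 2)
    d, e, f = np.linalg.lstsq(a, b, rcond=None)[0]
    cx, cy = -d / 2.0, -e / 2.0
    return cx, cy, float(np.sqrt(cx ** 2 + cy ** 2 - f))

# Illustrative slice: a noisy ring of radius 0.5 m around (0.02, -0.01).
theta = np.linspace(0.0, 2.0 * np.pi, 200)
slice_xy = np.stack([0.02 + 0.5 * np.cos(theta), -0.01 + 0.5 * np.sin(theta)], axis=1)
slice_xy += np.random.normal(scale=0.002, size=slice_xy.shape)
cx, cy, r = fit_circle(slice_xy)
print(f"estimated internal diameter: {2 * r:.3f} m")   # approximately 1.000 m
```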
[0076] Finally, FIG. 8 illustrates how information present in or obtained from the 3D model may be made available to other applications 802, for example through an API 803. An application 802 is for example a database for assessing the manholes of the sewage system, or a GIS (Geographic Information System) viewer where the geographical location of manholes is represented visually.
[0077] Although the present invention was illustrated by means of specific embodiments, it will be clear to the person skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be executed with various modifications and adaptations without departing from the field of application of the invention. The present embodiments should therefore in all respects be considered as illustrative and not restrictive, wherein the field of application of the invention is described by the attached claims and not by the foregoing description, and all modifications which fall within the meaning and scope of the claims are therefore included. In other words, it is understood to include all modifications, variations or equivalents falling within the area of application of the underlying basic principles and of which the essential attributes are claimed in this patent application. Moreover, the reader of this patent application will understand that the words “comprising” or “to comprise” do not exclude other elements or other steps, and that the word “a(n)” does not exclude the plural. Possible references in the claims may not be understood as limiting the respective claims. The terms “first”, “second”, “third”, “a”, “b”, “c” and the like, when used in the description or in the claims, are used to distinguish between similar elements or steps and do not necessarily describe a successive or chronological order. The terms “top”, “bottom”, “over”, “under” and the like are used in the same way for the purposes of the description and do not necessarily refer to relative positions. It should be understood that these terms are mutually interchangeable under the right conditions, and that the embodiments of the invention are able to function according to the present invention in orders or orientations other than those described or illustrated above.