METHOD AND DEVICE FOR ESTIMATING MECHANICAL PROPERTY OF ROCK JOINT
20220128729 · 2022-04-28
Inventors
CPC classification
G06V10/14
PHYSICS
G01B11/26
PHYSICS
G01B2210/52
PHYSICS
G01C11/02
PHYSICS
International classification
G01B11/26
PHYSICS
G01C11/02
PHYSICS
Abstract
A method and a device for estimating a mechanical property of a rock joint are presented. The method comprises obtaining a plurality of at least partly overlapping photos of the rock joint by at least one optical sensor comprised in a device, wherein said photos represent the rock joint from different perspectives or positions, generating a digital three-dimensional representation of the rock joint based on said plurality of photos, and determining the mechanical property of the rock joint based on the generated digital three-dimensional representation.
Claims
1. A method for estimating a mechanical property of a rock joint, the method comprising: obtaining a plurality of at least partly overlapping photos of the rock joint by at least one optical sensor comprised in a device, said photos representing the rock joint from different perspectives or positions, generating a digital three-dimensional representation of the rock joint based on said plurality of photos, and determining the mechanical property of the rock joint based on the generated digital three-dimensional representation.
2. The method according to claim 1, wherein the mechanical property is roughness, shear strength or friction angle, or any combination thereof.
3. The method according to claim 1, wherein a number of said plurality of photos of the rock joint is at least 20 or at least 40 photos, or in the range of 20-40 photos.
4. The method according to claim 1, wherein the generation of the digital three-dimensional representation includes determining a point cloud.
5. The method according to claim 1, wherein said plurality of photos represent the rock joint from different angles with respect to the rock joint.
6. The method according to claim 1, wherein the device comprises a distance sensor, such as an infrared distance sensor, and the method comprises providing instructions based on measurement data of the distance sensor to an operator via a display of the device for obtaining the plurality of at least partly overlapping photos.
7. The method according to claim 1, comprising moving the at least one optical sensor with respect to the rock joint while the rock joint remains still in its position.
8. The method according to claim 7, wherein said at least one optical sensor is comprised in a handheld device, and the method comprises moving the handheld device.
9. The method according to claim 7, wherein said at least one optical sensor is comprised in an unmanned aerial vehicle, such as a quadcopter, and the method comprises moving the unmanned aerial vehicle.
10. The method according to claim 1, comprising moving the rock joint with respect to said at least one optical sensor while said at least one optical sensor remains still in its position.
11. The method according to claim 10, wherein the rock joint is included in a sample arranged on a rotatable platform, and the method comprises moving the rock joint by the rotatable platform.
12. The method according to claim 1, comprising determining overlapping parts of said plurality of photos, and generating the digital three-dimensional representation of the rock joint based on the overlapping parts.
13. The method according to claim 4, comprising removing unnecessary points of the point cloud, such as points outside a portion comprising the rock joint.
14. The method according to claim 13, wherein the obtaining a plurality of at least partially overlapping photos comprises arranging the optical sensor such that there are 1-120 points, preferably 30-90, most preferably 45-65 points per square millimetre of the rock joint, such as with respect to a sample including the rock joint, in said plurality of photos.
15. The method according to claim 4, comprising performing a triangulation of the point cloud.
16. A device for estimating a mechanical property of a rock joint, wherein the device comprises at least one optical sensor for obtaining photos of the rock joint, and a control unit in connection with the at least one optical sensor, wherein the control unit comprises a processing unit configured to generate a digital three-dimensional representation of the rock joint based on a plurality of at least partly overlapping photos of the rock joint, and determine the mechanical property of the rock joint based on the generated digital three-dimensional representation.
17. The device according to claim 16, comprising a distance sensor, such as an infrared distance sensor, wherein the processing unit is configured to provide instructions based on measurement data of the distance sensor to an operator via a display of the device for obtaining the plurality of at least partly overlapping photos.
18. A handheld device comprising the device according to claim 16.
19. An unmanned aerial vehicle, such as a quadcopter, comprising the device according to claim 16.
20. A combination of a rotatable platform and the device according to claim 16.
Description
BRIEF DESCRIPTION OF FIGURES
[0040] Some embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings.
DETAILED DESCRIPTION OF SOME EMBODIMENTS
[0050] In some embodiments, there may be only one optical sensor 12 or a plurality of optical sensors 12. The plurality of sensors 12 may be arranged with a certain distance, such as 5 or 10 millimetres, from each other.
[0051] According to various embodiments, the device 10 may comprise a distance sensor 17, such as an infrared distance sensor. A line of sight related to the distance sensor 17 is marked with reference number 17L.
[0052] In embodiments comprising the distance sensor 17, the processing unit may be configured to provide instructions based on measurement data of the distance sensor 17 to an operator via a display 15 of the device 10 for obtaining the plurality of at least partly overlapping photos.
[0053] In addition, the device 10 may comprise a communication unit 16 for communicating with a device or system external to the device 10. The communication unit 16 may be based on wired or wireless technology. The communication unit 16 may be used to provide a short-range communication connection, such as via Bluetooth™, or an Ethernet connection, whether wireless or wired.
[0054] In addition, the device 10 may comprise an electrical power source 18 for providing electrical power to operate the device 10. The electrical power source 18 may be a battery or an electrical connector for connecting to an outside power source, such as an electrical grid or a designated electrical power providing device.
[0055] Furthermore, the device 10 may comprise a housing 19. In various embodiments, such as described hereinabove with respect to
[0057] Step 100 may refer to a start-up phase of the method. Suitable equipment and components may be obtained, and systems assembled and configured for operation. These may include obtaining or manufacturing a device in accordance with some embodiment of the present invention. Furthermore, necessary communication and/or electrical connections may need to be provided for the systems to operate correctly. Still further, the optical sensor 12, such as the camera, may need to be adjusted, for example, with respect to shutter speed and aperture settings.
[0058] Step 110 may refer to obtaining a plurality of at least partly overlapping photos of the rock joint by at least one optical sensor 12 comprised in a device 10, wherein said photos represent the rock joint from different perspectives or positions.
[0059] The mechanical property may be, for example, roughness, shear strength or friction angle, or any combination thereof.
[0060] In some embodiments, a number of said plurality of photos of the rock joint may be at least 20 or at least 40 photos. Alternatively, the number of said plurality of photos may be in the range of 20-40 photos. However, the number of said plurality of photos may also be at least 100, which gives even more accurate data for the determination of the mechanical property of the rock joint.
[0061] According to an embodiment, the obtaining may comprise arranging the optical sensor 12 such that a first distance between two neighbouring pixels represents a second distance of at most 0.05 centimetres in the rock joint, or in the sample or surface to be studied, in order to achieve submillimetre accuracy.
[0062] In some embodiments, the first distance may be estimated based on the distance between the optical sensor and the rock joint, the focal length, the sensor height and/or sensor width, and the image height and/or image width, respectively.
[0063] Alternatively or in addition, said plurality of photos may represent the rock joint from different angles with respect to the rock joint, such as 10, 20, 45, 60, 75 or 90 degrees.
[0064] In various embodiments, the method may comprise moving the optical sensor 12 with respect to the rock joint while the rock joint remains still in its position for obtaining the plurality of photos. In some embodiments, said at least one optical sensor 12 may be comprised in a handheld device, wherein the method comprises moving the handheld device. In other embodiments, said at least one optical sensor 12 may be comprised in an unmanned aerial vehicle, such as a quadcopter, wherein the method comprises moving the unmanned aerial vehicle.
[0065] Alternatively, the method may comprise moving the rock joint with respect to said at least one optical sensor 12 while said at least one optical sensor 12 remains still in its position. In some embodiments, the rock joint may be included in a sample arranged on a rotatable platform, wherein the method then comprises moving the rock joint, that is by moving the sample, by the rotatable platform.
[0066] In some embodiments, the method may comprise determining overlapping parts of said plurality of photos, and then generating the digital three-dimensional representation of the rock joint based on the overlapping parts.
[0067] In various embodiments, the photos may be transmitted to outside the device 10 for processing, such as to a cloud service or an external server. In some embodiments, the results of the processing in the cloud service or the external server may be transmitted back to the device or used as such.
[0068] Step 120 may refer to generating a digital three-dimensional representation of the rock joint based on said plurality of photos, optionally, by a control unit 14 of the device 10. Alternatively, step 120 may be performed in the cloud service or in the external server.
[0069] In a preferable embodiment, the generation of the digital three-dimensional representation may include determining a point cloud. In various embodiments, the method may comprise removing unnecessary points of the point cloud, such as points outside a portion comprising the rock joint. Still further, with or without said removing of unnecessary parts, the method may comprise performing a triangulation of the point cloud.
[0070] According to various embodiments, the obtaining may comprise arranging the optical sensor 12 such that there are 1-120 points, preferably 30-90, most preferably 45-65 points per square millimetre of the rock joint in said plurality of photos in order to achieve submillimetre accuracy. In various embodiments, an accuracy of 0.5 millimetres may be achieved to allow high accuracy joint roughness replication.
[0071] Step 130 may refer to determining the mechanical property of the rock joint based on the generated digital three-dimensional representation.
[0072] Method execution may be stopped at step 199. The method may be performed once, on demand, continuously, sequentially, or intermittently, for instance. The target rock joint or target surface may be changed between two instances of method execution, for instance.
[0073] In an embodiment, the device may comprise a distance sensor 17, such as an infrared distance sensor, and the method may then comprise providing instructions based on measurement data of the distance sensor 17 to an operator via a display 15 of the device 10 for obtaining the plurality of at least partly overlapping photos.
[0074] The accuracy of the result may be controlled by adjusting various parameters, such as related to the device 10, the environmental conditions, or to the procedure itself.
[0075] Some examples of the device 10 related parameters are as follows: resolution of the optical sensor/camera 12 (such as sensor pixel size), type of the lens, optical sensor 12 related settings (ISO speed or setting, aperture, shutter speed).
[0076] Some examples of the environmental conditions related parameters are as follows: the size of the measured object including the rock joint 5, lighting conditions.
[0077] Some examples of the procedure related parameters are as follows: distance from camera to the rock joint 5, number of obtained photos, camera intersection angle.
[0078] In various embodiments, the control of accuracy may be performed by adjusting at least the optical sensor 12 resolution and the distance between the optical sensor 12 and the rock joint 5.
[0079] In various embodiments, a plurality of photos of at least partly overlapping content may be utilized to create or generate 3D point clouds of the rock joint 5, preferably with submillimetre accuracy. In these embodiments, a high-resolution series of photos of convergent and overlapping optical sensor views may be utilized as input.
[0080] In various embodiments, a wall sampling distance may, advantageously, be arranged to equal the inverse of the Nyquist sample frequency for the wall sampling grid. In some embodiments, the wall sampling distance of the plurality of photos may be at most 0.05 centimetres per pixel to achieve the submillimetre accuracy.
[0081] In some embodiments, the value of the wall sampling distance may be estimated by the equation: first distance = max(D × sensor height / (focal length × image height), D × sensor width / (focal length × image width)), where D is the distance between the optical sensor and the rock joint.
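The estimate in paragraph [0081] can be sketched as follows. This is a non-limiting illustration; the function and parameter names are chosen for this sketch and do not appear in the source.

```python
def wall_sampling_distance(distance_to_joint_mm, focal_length_mm,
                           sensor_height_mm, sensor_width_mm,
                           image_height_px, image_width_px):
    """Estimate the wall sampling distance (mm per pixel): the larger of the
    vertical and horizontal sample distances projected onto the rock joint."""
    gsd_vertical = (distance_to_joint_mm * sensor_height_mm
                    / (focal_length_mm * image_height_px))
    gsd_horizontal = (distance_to_joint_mm * sensor_width_mm
                      / (focal_length_mm * image_width_px))
    return max(gsd_vertical, gsd_horizontal)

# Example: 24 x 36 mm sensor, 50 mm lens, 4000 x 6000 px image,
# optical sensor 500 mm from the rock joint.
wsd = wall_sampling_distance(500, 50, 24, 36, 4000, 6000)
print(f"{wsd:.3f} mm/pixel")  # 0.060 mm/pixel, under the 0.5 mm (0.05 cm) target
```

With these example values the sampling distance stays well below the 0.05 centimetres per pixel limit of paragraph [0080], so submillimetre accuracy would be achievable at that camera distance.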
[0082] In various embodiments, a step of pre-processing may be performed on the plurality of obtained at least partly overlapping photos. In the pre-processing step, the photos may be processed, such as cut, so that all resulting portions of the photos, such as cuts, include substantially only the overlapping portions of the photos, preferably including (the area or portion of the surface comprising) the rock joint 5. Then the variance of the Laplacian (LAPV) of the photo binary data may be calculated to estimate the blurriness of the photo cuts. For the calculation, the numerical grey-scale values of the photo, i.e., the value component of the photo pixels in the HSV (hue, saturation, value) color model, may be used. Preferably, the processing may be done for cuts of the original set of photos to select only the parts of the photo content that are in focus.
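The LAPV blur estimate of paragraph [0082] can be sketched with a plain 3x3 Laplacian kernel applied to the grey-scale channel. This is an illustrative implementation, not the patented one; the function name and test images are assumptions of this sketch.

```python
import numpy as np

def variance_of_laplacian(gray):
    """Blur estimate: convolve the grey-scale (HSV value) channel with a 3x3
    Laplacian kernel and return the variance of the response. Sharp content
    gives a high LAPV; blurred content gives a low LAPV."""
    g = np.asarray(gray, dtype=np.float64)
    # 3x3 Laplacian response on the interior pixels (no padding)
    lap = (g[:-2, 1:-1] + g[2:, 1:-1] + g[1:-1, :-2] + g[1:-1, 2:]
           - 4.0 * g[1:-1, 1:-1])
    return lap.var()

# A sharp checkerboard scores far higher than a smoothed copy of itself.
sharp = np.indices((64, 64)).sum(axis=0) % 2 * 255.0
blurred = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, 1, 1)
           + np.roll(sharp, (1, 1), (0, 1))) / 4.0
print(variance_of_laplacian(sharp) > variance_of_laplacian(blurred))  # True
```

In practice the same measure is commonly computed with an image-processing library's Laplacian operator; the hand-rolled convolution above only serves to make the definition explicit.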
[0083] In some embodiments, the LAPV values may be compared with pre-determined values based on a long-term collection of photo samples. If a photo sample has a LAPV value that is less than the mean value minus two times the standard deviation of the collection, the photo may be removed. The selection criterion may be based on the expectation that the LAPV values follow a normal distribution, so that approximately 95% of the values fall within two standard deviations of the mean. However, an additional criterion may be that the photo is removed only if there is some other photo covering the particular area of the rock joint 5.
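The mean-minus-two-standard-deviations rule of paragraph [0083] can be sketched as below. The reference statistics are assumed to come from the long-term collection; the coverage safeguard (keeping a blurry photo when no other photo covers its area) is not modeled here.

```python
def select_sharp_photos(lapv_values, reference_mean, reference_std):
    """Keep a photo only if its LAPV is at least the reference mean minus
    two reference standard deviations; return the kept indices."""
    threshold = reference_mean - 2.0 * reference_std
    return [i for i, v in enumerate(lapv_values) if v >= threshold]

# Illustrative reference statistics and LAPV values.
kept = select_sharp_photos([120.0, 95.0, 20.0, 130.0],
                           reference_mean=100.0, reference_std=15.0)
print(kept)  # [0, 1, 3] -- index 2 (LAPV 20.0) falls below 100 - 2*15 = 70
```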
[0084] The overlap of the photos may be evaluated using the XY-coordinates of the scale-invariant feature transform (SIFT) features found in the photos for determining points that are shared by many photos. This may be estimated in low resolution and with a limited number of calculated features. To obtain better performance, photos that cover the same area of the rock joint 5 more than a limited number of times may be removed.
[0085] In an embodiment, alternatively or in addition, to obtain even better performance, the photo sets that cover the same area of the rock joint 5 more than m+x times, where m is the minimum number of photos to gain the relative 3D positioning of the photo shot rock surface points, and x is the extra number of photos after the positioning is not essentially improved, can be reduced. The extra photos can be removed randomly or using the LAPV quality estimate of the photos.
[0086] In some embodiments, the SIFT detection in the photos may be calculated using the graphics processing unit(s) (GPU(s)) comprised in the control unit of the device 10 or on an external computing system on which the obtained photos may be transmitted. For example, a distributed calculation may be performed by associating photos for each GPU so that the division is in proportion to the computing power of the GPUs.
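The proportional division described in paragraph [0086] can be sketched as a simple partition of photo indices. The function name and the representation of "computing power" as relative weights are assumptions of this sketch.

```python
def divide_photos_by_gpu_power(num_photos, gpu_powers):
    """Split photo indices across GPUs in proportion to their relative
    computing power; return one index list per GPU."""
    total = sum(gpu_powers)
    # Cumulative share boundaries, rounded to whole photos
    bounds = [0]
    cumulative = 0.0
    for power in gpu_powers:
        cumulative += power
        bounds.append(round(num_photos * cumulative / total))
    return [list(range(bounds[i], bounds[i + 1]))
            for i in range(len(gpu_powers))]

# Two GPUs, the first twice as fast: it receives two thirds of 90 photos.
batches = divide_photos_by_gpu_power(90, [2.0, 1.0])
print([len(b) for b in batches])  # [60, 30]
```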
[0087] In various embodiments, full image matching may be utilized in order to find matching features in the photos. The full list of combinations of all photos may then be generated. At this stage, pairs of photos which are known to be far from each other can be filtered out. In some embodiments, for example, photos that have location information, such as a GPS (Global Positioning System) tag or otherwise provided, showing a distance of more than a set value, for example 0.5 metres or 10 degrees, between the photo-taking locations, or corresponding distances in steps of sequential photo-taking, may be filtered out. The resulting list may be divided and sent to the calculating host together with the associated SIFT values and photos.
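The pair pre-filtering of paragraph [0087] can be sketched as follows. For simplicity, the photo-taking locations are assumed to already be (x, y) coordinates in metres in a local frame rather than raw GPS tags; the function name is illustrative.

```python
import math

def filter_distant_pairs(photo_positions, max_distance_m=0.5):
    """From the full list of photo pairs, keep only pairs whose photo-taking
    locations are within max_distance_m of each other; distant pairs are
    unlikely to share matching features and are filtered out."""
    pairs = []
    n = len(photo_positions)
    for i in range(n):
        for j in range(i + 1, n):
            dx = photo_positions[i][0] - photo_positions[j][0]
            dy = photo_positions[i][1] - photo_positions[j][1]
            if math.hypot(dx, dy) <= max_distance_m:
                pairs.append((i, j))
    return pairs

positions = [(0.0, 0.0), (0.3, 0.0), (5.0, 5.0)]
print(filter_distant_pairs(positions))  # [(0, 1)] -- photo 2 is far from both others
```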
[0088] In some embodiments, a technique for 3D reconstruction using structure from motion may be used to compute the matches. The result may then be written to digital files. When the processing of the matching sets has been performed, the results may be combined, and the estimates of the original optical sensor 12 locations calculated. Thus, a sparse point cloud of the model may be generated. After generating the sparse point cloud, a dense point cloud may be generated, for example, in Polygon File Format.
[0089] In some embodiments, noise filtering may be performed on the point clouds to reduce the number of floating points. The noise filtering may remove points far from their neighbours based on the standard deviation of the distance.
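A statistical outlier filter in the spirit of paragraph [0089] can be sketched as below. The neighbour count and deviation ratio are illustrative parameters, and the brute-force distance matrix is only suitable for small clouds; a spatial index would be used at scale.

```python
import numpy as np

def remove_floating_points(points, k=3, std_ratio=1.0):
    """For each point, compute the mean distance to its k nearest neighbours,
    then drop points whose mean distance exceeds the global mean of these
    values by more than std_ratio standard deviations."""
    pts = np.asarray(points, dtype=np.float64)
    # Pairwise distances (fine for small clouds; a KD-tree scales better)
    diff = pts[:, None, :] - pts[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=2))
    # Mean distance to the k nearest neighbours, excluding the point itself
    knn_mean = np.sort(dist, axis=1)[:, 1:k + 1].mean(axis=1)
    threshold = knn_mean.mean() + std_ratio * knn_mean.std()
    return pts[knn_mean <= threshold]

# A tight 3x3 grid plus one floating point far away: the outlier is removed.
cloud = ([(0.1 * i, 0.1 * j, 0.0) for i in range(3) for j in range(3)]
         + [(10.0, 10.0, 10.0)])
print(len(remove_floating_points(cloud)))  # 9
```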
[0090] In an embodiment, if the number of the plurality of photos is high, such as over 100 or even over 1000, and there are no cues about the proximity of the photo-taking locations, a copy of the photo set can be scaled to 2-5% of the original size and the above SIFT detection and matching applied to the scaled-down data. The resulting match data may then be used to create matching pair lists for the full-scale photos.
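The coarse downscaling of paragraph [0090] can be sketched as block averaging of the grey-scale data; matching on the small copies then suggests pair lists for the full-scale photos. The function name and fixed block-average strategy are assumptions of this sketch.

```python
import numpy as np

def downscale_for_matching(gray, scale=0.04):
    """Downscale a grey-scale image to roughly `scale` times its original
    linear size by averaging square pixel blocks; edge rows and columns that
    do not fill a whole block are discarded."""
    g = np.asarray(gray, dtype=np.float64)
    step = max(1, int(round(1.0 / scale)))
    h = (g.shape[0] // step) * step
    w = (g.shape[1] // step) * step
    blocks = g[:h, :w].reshape(h // step, step, w // step, step)
    return blocks.mean(axis=(1, 3))

small = downscale_for_matching(np.ones((1000, 800)), scale=0.04)
print(small.shape)  # (40, 32) -- 4% of the original linear size
```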
[0098] The control unit 14 may comprise one or more processors 804, one or more volatile or non-volatile memories 806 for storing portions of computer program code 807A-807N and any data values, and possibly one or more user interface units 810. The mentioned elements may be communicatively coupled to each other, e.g. with an internal bus.
[0099] The processor 804 of the control unit 14 may be at least configured to implement at least some method steps as described hereinbefore, such as in connection with
[0100] The specific examples provided in the description given above should not be construed as limiting the applicability and/or the interpretation of the appended claims. Lists and groups of examples provided in the description given above are not exhaustive unless otherwise explicitly stated.