SYSTEM AND METHOD FOR GEOLOCATION OF AN OBJECT IN WATER
20230260148 · 2023-08-17
Assignee
Inventors
- Andrea SAIANI (Trambileno (TN), IT)
- Emanuele ROCCO (Trambileno (TN), IT)
- Nadir PAGNO (Soranzen (BL), IT)
- Isacco GOBBI (Casaleone (VR), IT)
- Donato D'ACUNTO (Trento, IT)
CPC classification
G06T7/246
PHYSICS
G06T7/277
PHYSICS
International classification
G06V10/74
PHYSICS
Abstract
A system for geolocation of an object in water includes: first and second devices for immersion in, or flotation on, water, the first device including a light source that emits a light beam and the second device including a camera and a measuring device; and a processing unit, operatively connected to the camera, configured to: determine a vertical distance between the first and second devices based on the depth of both devices, capture a 2D image of the first device via the camera, calculate the pixel position in the image of the light beam from the light source, and calculate a position of the first device relative to the main reference frame based on the pixel position of the light beam, the orientation of the camera, a position of the second device relative to the main reference frame, and the vertical distance.
Claims
1. System (100;101;102) for geolocation of an object in water, the system comprising: a first device (1) configured to be immersed in, or to float on, water, the first device (1) comprising a light source (2) apt to emit a light beam (3), a second device (4) configured to be immersed in, or to float on, water, the second device (4) comprising a camera (6) for taking 2D images and a measuring device (7) arranged to provide an orientation of the camera (6) relative to a main reference frame defined by three orthogonal axes (X, Y, Z), a processing unit (9) operatively connected to at least the camera (6), the processing unit (9) being configured to: determine a vertical distance (Δz) between the first device (1) and the second device (4) based on a depth in the water of both devices, capture a 2D image of the first device (1) through the camera (6), calculate the pixel position in the 2D image of the light beam (3) emitted by the light source (2) of the first device (1), and calculate a position of the first device (1) relative to the main reference frame based on the pixel position of the light beam (3), the orientation of the camera (6), a position of the second device (4) relative to the main reference frame and the vertical distance (Δz) and/or to calculate a position of the second device (4) relative to the main reference frame based on the pixel position of the light beam (3), the orientation of the camera (6), a position of the first device (1) relative to the main reference frame and the vertical distance (Δz).
2. The system according to claim 1, wherein the position of the second device (4) or the first device (1) used for calculating the position of the first device (1) or the second device (4), respectively, is stored in a data storage or provided by a position device and wherein the depth of the first device (1) in the water is stored in the data storage or measured by a depth gauge (12) comprised in the first device (1) and the depth of the second device (4) in the water is stored in the data storage or measured by a depth gauge comprised in the second device (4).
3. The system according to claim 2, wherein the processing unit (9) is operatively connected to the data storage and/or the position device so as to obtain the position of the second device (4) or the first device (1) for calculating the position of the first device (1) or the second device (4), respectively, and wherein the processing unit (9) is operatively connected to the data storage and/or the depth gauge of the first device (1) so as to obtain the depth of the first device (1) in the water and to the data storage and/or the depth gauge of the second device (4) so as to obtain the depth of the second device (4) in the water for determining the vertical distance (Δz).
4. The system according to claim 3, wherein the position device comprises at least one of an absolute position sensor, a real-time kinematic (RTK) positioning system, mobile-phone tracking, a real-time locating system based on radio, optical or ultrasonic technology, and a positioning system based on methods of underwater acoustic positioning, such as USBL, LBL or SBL, wherein the first device (1) or the second device (4) is provided with the position device.
5. The system according to claim 1, wherein the first device (1) comprises a first control unit (13) connected to the light source (2) for modulation of the light beam (3) so that the light beam (3) transmits information about the position and/or depth of the first device (1) and wherein the second device (4) comprises an optical sensor (14) configured to detect the light beam (3), the processing unit (9) being connected to the optical sensor (14) for obtaining the position and/or depth of the first device (1) based on the light beam (3) detected by the optical sensor (14).
6. The system according to claim 1, wherein one of the first device (1) or the second device (4) has an acoustic emitter (15) for emitting an acoustic signal which represents the position and/or depth of the relevant device and the other of the first device and the second device has an acoustic receiver (16) for receiving the acoustic signal emitted by the acoustic emitter (15), the processing unit (9) being connected to the acoustic receiver (16) for obtaining the position and/or depth of the one of the first device (1) or the second device (4) based on the signal received by the acoustic receiver (16).
7. The system according to claim 1, wherein the first device (1) and the second device (4) are connected to each other by a marine communication cable (17) through which the first device transmits to the second device information on its depth and/or position and/or the second device transmits to the first device information on its depth and/or position.
8. The system according to claim 1, wherein the processing unit is configured to predict a next position of the light beam (3) in the 2D image captured by the camera (6) by performing a recursive filtering algorithm based on at least an actual position and previous positions of the light beam (3) in the 2D image.
9. The system according to claim 1, wherein the first device comprises at least two light sources configured to emit respective light beams, the distance between each pair of light sources being fixed, and wherein the processing unit (9) is configured to: calculate the pixel position in the 2D image of each light beam emitted by the at least two light sources of the first device, and calculate the position of each of the at least two light sources relative to the main reference frame based on the pixel position of the relevant light beam, the orientation of the camera, a position of the second device relative to the main reference frame and the vertical distance, and to determine the orientation of the first device relative to the second device based on the calculated positions of the at least two light sources, and/or calculate the position of each of the at least two light sources relative to the main reference frame based on the pixel position of the relevant light beam, an orientation of a rigid surface of the first device relative to the main reference frame, a position of the first device relative to the main reference frame and the vertical distance, and to determine the orientation of the second device relative to the first device based on the calculated positions of the at least two light sources.
10. Method for geolocation of an object in water, the method comprising: putting a first device (1) into water, wherein the first device comprises a light source (2) configured to emit a light beam (3), putting a second device (4) into the water, the second device comprising a camera (6) for taking images, emitting the light beam (3) by means of the light source (2), obtaining an orientation of the camera (6) relative to a main reference frame defined by three orthogonal axes (X, Y, Z), obtaining a depth of the first device (1) and the second device (4) in the water, determining a vertical distance (Δz) between the first device (1) and the second device (4) based on the depth thereof in the water, capturing a 2D image of the first device (1) via the camera (6), calculating the pixel position in the 2D image of the light beam (3) emitted by the light source (2) of the first device (1), and obtaining a position of the second device (4) relative to the main reference frame and calculating a position of the first device (1) relative to the main reference frame based on the pixel position of the light beam (3), the orientation of the camera (6), the position of the second device (4) relative to the main reference frame and the vertical distance (Δz), or obtaining a position of the first device (1) relative to the main reference frame and calculating a position of the second device (4) relative to the main reference frame based on the pixel position of the light beam (3), the orientation of the camera (6), the position of the first device (1) relative to the main reference frame and the vertical distance (Δz).
11. The method according to claim 10, wherein the position of the second device (4) or the first device (1) used for calculating the position of the first device (1) or the second device (4), respectively, is stored in a data storage or provided by a position device and wherein the depth of the first device in the water is stored in the data storage or measured by a depth gauge (12) comprised in the first device (1) and the depth of the second device (4) in the water is stored in the data storage or measured by a depth gauge comprised in the second device (4).
12. The method according to claim 11, wherein the position device comprises at least one of an absolute position sensor, a real-time kinematic (RTK) positioning system, mobile-phone tracking, a real-time locating system based on radio, optical or ultrasonic technology, and a positioning system based on methods of underwater acoustic positioning, such as USBL, LBL or SBL, wherein the first device (1) or the second device (4) is provided with the position device.
13. The method according to claim 10, wherein the step of obtaining the position and/or depth of the first device (1) comprises: modulating the emitted light beam (3) so that the light beam transmits information about the position and/or depth of the first device (1), detecting the light beam (3) by an optical sensor (14), and determining the position and/or depth of the first device (1) based on the light beam (3) detected by the optical sensor (14).
14. The method according to any one of claims 10-13, wherein the step of obtaining the position and/or depth of at least one of the first device (1) and the second device (4) in the water comprises: emitting an acoustic or electric signal which represents the position and/or depth of one between the first device (1) or the second device (4), receiving the acoustic or electric signal, determining the position and/or depth of the one between the first device (1) or the second device (4) based on the received acoustic or electric signal.
15. The method according to claim 10, further comprising: predicting a next position of the light beam (3) in the 2D image captured by the camera (6) by performing a recursive filtering based on at least an actual position and previous positions of the light beam (3) in the 2D image.
16. The method according to claim 10, wherein the first device comprises at least two light sources configured to emit respective light beams, the distance between each pair of light sources being fixed, the method further comprising: calculating the pixel position in the 2D image of each light beam emitted by each light source of the first device, calculating the position of each of the at least two light sources relative to the main reference frame based on the pixel position of the relevant light beam, the orientation of the camera, a position of the second device relative to the main reference frame and the vertical distance, and determining the orientation of the first device relative to the second device based on the calculated positions of the at least two light sources, or calculating the position of each of the at least two light sources relative to the main reference frame based on the pixel position of the relevant light beam, an orientation of a rigid surface of the first device relative to the main reference frame, a position of the first device relative to the main reference frame and the vertical distance, and determining the orientation of the second device relative to the first device based on the calculated positions of the at least two light sources.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0210] Features and advantages of the invention will be better appreciated from the following detailed description of preferred embodiments thereof, which are illustrated by way of non-limiting example with reference to the appended Figures.
DESCRIPTION OF EMBODIMENTS OF THE INVENTION
[0217] With reference to
[0218] The second device 4 comprises a camera 6 for taking 2D images and a measuring device 7 arranged to provide an orientation of the camera 6 relative to a main reference frame defined by three orthogonal axes X, Y, Z.
[0219] The measuring device 7 comprises inertial sensors and a magnetic compass for providing the orientation imu.sub.cam of the camera 6 relative to the main reference frame.
[0220] As shown in
[0221] The second device 4 further comprises a GPS sensor 8 for measuring the position of the second device 4, in particular the position P.sub.cam of the camera 6, relative to the main reference frame.
[0222] The second device 4 comprises a processing unit 9 operatively connected to the camera 6. The processing unit 9 is configured to determine a vertical distance Δz between the first device 1 and the second device 4 based on the depth in the water of both devices. The distance Z.sub.L between the first device 1 and the second device 4 is the distance between the image plane 10 of the camera 6 and the light plane 11 of the light beam 3. The vertical distance Δz accounts for the offset between the camera 6 and the water surface and for the offset between the light source and a depth gauge 12 of the first device 1, which measures the depth of the first device 1 in the water.
[0223] For instance, the position of the camera 6 (in mm) relative to the main reference frame (X, Y, Z) is P.sub.cam=(X.sub.cam, Y.sub.cam, 0)=(380, 0, 0), the orientation of the camera 6 relative to the main reference frame is imu.sub.cam=(pitch, roll, yaw)=(π, 0, 0) and the vertical distance is Δz=−650 (mm). Δz is given by the difference between the depth of the light source, −650 mm, and the null depth of the camera.
[0224] The pitch is set to π since the camera 6 is down looking with respect to the vertical whereas the roll is 0.
[0225] Moreover, in this example the camera 6 has an intrinsic camera matrix C.sub.m and distortion parameters k.sub.1, k.sub.2, k.sub.3, p.sub.1, p.sub.2 defined as follows:
[0226] The processing unit 9 is configured to capture a 2D image of the light beam 3 through the camera 6 and to calculate the pixel position p=(p.sub.u, p.sub.v) in the 2D image of the light beam 3 through a light beam detection, wherein p.sub.u, p.sub.v are the positions in pixels of the light beam 3 along the u, v axes which define the image reference frame of the image plane 10.
[0227] In this example, the calculated pixel position is:
[0228] The processing unit 9 is also configured to calculate a position P of the first device 1 (light beam 3) relative to the main reference frame based on the pixel position p of the light beam, the orientation of the camera, the position P.sub.cam and the vertical distance Δz.
[0229] In particular, the calculation of the position P of the first device 1 entails the calculation of an adjusted pixel position p′=(p′.sub.u, p′.sub.v) by the following equation:
[0230] and a distortion correction operation applied to the adjusted pixel position p′ to obtain an undistorted pixel position pu=(pu.sub.u, pu.sub.v) of the light beam 3.
[0231] The undistorted pixel position pu=(pu.sub.u, pu.sub.v) of the light beam 3 is obtained by solving the following equations:
pu_u (1 + r^2 k_1 + r^4 k_2 + r^6 k_3) + [2 p_1 pu_u pu_v + p_2 (r^2 + 2 pu_u^2)] − p′_u = 0
pu_v (1 + r^2 k_1 + r^4 k_2 + r^6 k_3) + [2 p_2 pu_u pu_v + p_1 (r^2 + 2 pu_v^2)] − p′_v = 0
where r = √(pu_u^2 + pu_v^2).
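The two implicit equations above can be solved numerically. The following is a minimal sketch, not the authoritative implementation: it recovers the undistorted position by fixed-point iteration, a common way to invert this radial/tangential (Brown-Conrady) distortion model. The coefficient values used in the demonstration are arbitrary assumptions, since the actual intrinsic matrix C.sub.m and distortion parameters are not reproduced in this text.

```python
def undistort_point(p_adj, k1, k2, k3, p1, p2, iters=25):
    """Solve the implicit distortion equations for the undistorted pixel
    position pu = (pu_u, pu_v), given the adjusted pixel position
    p' = (p'_u, p'_v), by fixed-point iteration."""
    pu_u, pu_v = p_adj  # initial guess: the distorted point itself
    for _ in range(iters):
        r2 = pu_u**2 + pu_v**2
        radial = 1.0 + k1*r2 + k2*r2**2 + k3*r2**3
        # tangential distortion terms
        du = 2*p1*pu_u*pu_v + p2*(r2 + 2*pu_u**2)
        dv = 2*p2*pu_u*pu_v + p1*(r2 + 2*pu_v**2)
        # rearrange each equation so the undistorted coordinate is isolated
        pu_u = (p_adj[0] - du) / radial
        pu_v = (p_adj[1] - dv) / radial
    return pu_u, pu_v

# Illustrative coefficients (assumed, not from the patent):
pu = undistort_point((0.3, 0.2), k1=-0.1, k2=0.01, k3=0.0, p1=0.001, p2=0.001)
```

Substituting the result back into the two equations reproduces p′, which is a convenient self-check for the iteration.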
[0232] The undistorted pixel position is therefore:
[0233] In order to calculate the position of the first device relative to the main reference frame, the unrotated vector p.sub.uR can be obtained by the following equation:
[0234] where R is a rotation matrix which represents the orientation of the camera relative to the main reference frame. The rotation matrix R can be obtained from successive Euler rotations, given the Euler angles (pitch, roll and yaw) measured by the inertial sensors and applied preferably in their supposed order, or estimated directly from the quaternions supplied by the inertial sensors. In a particular embodiment, R can be given by the following matrix:
[0235] where C[ ] stands for the cos[ ] function, S[ ] for the sin[ ] function, ϕ=yaw, ψ=roll and θ=pitch, whose rotations have been taken in this order.
[0236] Alternatively, R can be given by the following matrix:
[0237] where the q.sub.1, q.sub.2, q.sub.3, q.sub.4 are given by the joint operation of the camera magnetic compass (and/or GPS compass) and inertial measurement unit.
[0238] In the example, given imu.sub.cam=(pitch, roll, yaw)=(π, 0, 0), where pitch=π since the camera is down looking, R becomes:
[0239] And p.sub.uR=(−0.6688, −0.0260, −1).
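The composition of Euler rotations described in paragraphs [0234]-[0238] can be sketched as follows. This is an illustration under stated assumptions, not the patent's exact convention: the matrices in [0234]-[0236] are not reproduced in this text, so the axis assignment (yaw about Z, roll about Y, pitch about X) and the composition order are assumptions based on the order stated in [0235].

```python
from math import sin, cos, pi

def rot_x(a):  # rotation about the X axis
    return [[1, 0, 0], [0, cos(a), -sin(a)], [0, sin(a), cos(a)]]

def rot_y(a):  # rotation about the Y axis
    return [[cos(a), 0, sin(a)], [0, 1, 0], [-sin(a), 0, cos(a)]]

def rot_z(a):  # rotation about the Z axis
    return [[cos(a), -sin(a), 0], [sin(a), cos(a), 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def matvec(R, v):
    return [sum(R[i][k] * v[k] for k in range(3)) for i in range(3)]

def camera_rotation(pitch, roll, yaw):
    """Compose the Euler rotations in the order stated in [0235]
    (yaw, then roll, then pitch); the axis assignment is an assumption."""
    return matmul(matmul(rot_z(yaw), rot_y(roll)), rot_x(pitch))
```

For the down-looking example imu.sub.cam=(π, 0, 0), any such convention with pitch=π flips the optical axis: the camera ray (0, 0, 1) maps to (0, 0, −1), consistent with the negative third component of p.sub.uR.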
[0240] In this example p.sub.uR is renormalized, preferably in mm, to obtain P.sub.R=(P.sub.RX, P.sub.RY, P.sub.RZ) by the following scaling equation:
P.sub.R=α p.sub.uR
[0241] In this example, α=Z.sub.L, where Z.sub.L is the (minimum) distance between the camera plane and the light plane. Preferably, the light plane is the plane parallel to the camera plane and passing through the point in space defined by the light source.
[0242] Z.sub.L is given by the following equation:
Z_L = Δz · sec(ψ − arctan(pu_u)) · sec(θ − arctan(−pu_v)) / (√(1 + pu_u^2) · √(1 + pu_v^2))
[0243] where sec is the secant function, ψ=roll and θ=pitch. Δz is given by the difference between the depth of the light source and the depth of the camera.
[0244] In this example, α=Z.sub.L=−Δz=650 and P.sub.R=(−434.747, −16.9138, −650), expressed in mm since Z.sub.L is in mm.
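The Z.sub.L equation of paragraph [0242] can be sketched directly. One useful sanity check, which the code below performs, follows from the formula itself: for a down-looking camera (ψ=0, θ=π) the secant factors cancel the square roots exactly and the expression reduces to Z_L = −Δz, matching the example value α = 650 mm for Δz = −650 mm. The pu_u, pu_v magnitudes in the demonstration are taken from the order of magnitude of the worked example, but their exact values are not given in this text.

```python
from math import atan, cos, sqrt, pi

def sec(x):
    return 1.0 / cos(x)

def light_plane_distance(dz, pu_u, pu_v, roll, pitch):
    """Z_L of paragraph [0242]: the (minimum) distance between the camera
    plane and the light plane, from the depth difference dz and the
    undistorted normalized pixel position (pu_u, pu_v)."""
    return (dz * sec(roll - atan(pu_u)) * sec(pitch - atan(-pu_v))
            / (sqrt(1 + pu_u**2) * sqrt(1 + pu_v**2)))

# Down-looking example from the description: Δz = −650 mm, ψ = 0, θ = π.
z_l = light_plane_distance(-650.0, 0.6688, -0.0260, roll=0.0, pitch=pi)
```

For any pu_u, pu_v, the result with roll=0 and pitch=π equals −Δz up to floating-point roundoff, which is why the example simply sets α = −Δz = 650.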
[0245] The position P=(P.sub.X,P.sub.Y,P.sub.Z) of the first device relative to the main reference frame (X,Y,Z) can therefore be obtained by the translation equation:
P=P.sub.R+t
[0246] where t=(t.sub.X,t.sub.Y,t.sub.Z) is a translation vector which represents the position of the second device relative to the main reference frame (X,Y,Z) that is t=P.sub.cam=(X.sub.cam, Y.sub.cam, 0).
[0247] Since P.sub.cam=(380, 0, 0), P=(P.sub.X, P.sub.Y, P.sub.Z)=(−54.74, −16.913, −650).
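The last two steps of the worked example, the scaling of paragraph [0240] and the translation of paragraph [0245], amount to a single affine operation. A minimal sketch, using the p.sub.uR, α and P.sub.cam values quoted in the text (small differences in the last digits are expected, since p.sub.uR is printed rounded):

```python
def locate_first_device(p_uR, alpha, t):
    """P = alpha * p_uR + t: scale the unrotated direction vector by the
    camera-to-light-plane distance, then translate by the camera position
    (paragraphs [0240] and [0245])."""
    return tuple(c * alpha + ti for c, ti in zip(p_uR, t))

# Values from the worked example in the description:
P = locate_first_device((-0.6688, -0.0260, -1.0), 650.0, (380.0, 0.0, 0.0))
# approximately (-54.7, -16.9, -650.0) mm, matching paragraph [0247]
```

The third component equals −650 mm by construction: the light source sits one light-plane distance below the camera along the rotated optical axis.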
[0248] The first device comprises a first control unit 13 connected to the light source 2 for modulation of the light beam 3 so that the light beam transmits information about the depth of the first device 1.
[0249] The second device comprises an optical sensor 14 (a photodiode) apt to detect the light beam 3, the processing unit 9 being connected to the optical sensor 14 for obtaining the depth of the first device 1 based on the light beam detected by the optical sensor 14.
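Paragraphs [0248]-[0249] state that the light beam is modulated to carry the first device's depth, but no encoding scheme is specified. The sketch below is therefore purely hypothetical: it illustrates one simple possibility, on-off keying of a 16-bit depth value behind a fixed preamble, with all framing choices (preamble pattern, word length, bit order) assumed for illustration.

```python
PREAMBLE = [1, 0, 1, 0, 1, 0, 1, 1]  # assumed sync pattern, not from the patent

def encode_depth_ook(depth_mm):
    """Encode a depth (0..65535 mm) as the bit sequence driving the
    on-off keying of the light source: preamble + 16 data bits, MSB first."""
    assert 0 <= depth_mm < 1 << 16
    data = [(depth_mm >> i) & 1 for i in range(15, -1, -1)]
    return PREAMBLE + data

def decode_depth_ook(bits):
    """Recover the depth from the bit sequence sampled by the photodiode
    (optical sensor 14), after clock recovery and thresholding."""
    assert bits[:len(PREAMBLE)] == PREAMBLE, "preamble not found"
    depth = 0
    for b in bits[len(PREAMBLE):len(PREAMBLE) + 16]:
        depth = (depth << 1) | b
    return depth
```

A real implementation would also need synchronization and error checking; the round trip decode(encode(d)) == d is the only property this sketch demonstrates.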
[0250]
[0251]
[0252] In addition, P.sub.0, Q and H are set as:
[0253] And J is a null 4×4 matrix.
[0254] Below is a table showing twenty example points, ten at the beginning of the series and ten at the end, of the observed undistorted light position o.sub.k=(pu.sub.u, pu.sub.v), the predicted state s′.sub.k and the estimated state s.sub.k, which have been obtained by the Kalman filter according to the above-mentioned parameters.
TABLE-US-00002 (states s′.sub.k and s.sub.k given as 4-tuples)

k    o.sub.k      s′.sub.k                        s.sub.k
0    (601, 348)   (0.0, 0.0, 0.0, 0.0)            (595.0, 345.0, 24.0, 14.0)
1    (600, 347)   (595.96, 345.56, 24.0, 14.0)    (598.0, 346.0, 32.0, 17.0)
2    (600, 347)   (599.28, 346.68, 32.0, 17.0)    (600.0, 347.0, 34.0, 18.0)
3    (599, 346)   (601.36, 347.72, 34.0, 18.0)    (600.0, 347.0, 26.0, 12.0)
4    (599, 346)   (601.04, 347.48, 26.0, 12.0)    (600.0, 347.0, 20.0, 7.0)
5    (598, 345)   (600.8, 347.28, 20.0, 7.0)      (600.0, 347.0, 20.0, 7.0)
6    (597, 345)   (600.52, 346.04, 13.0, 1.0)     (599.0, 346.0, 5.0, -1.0)
7    (597, 344)   (599.2, 345.96, 5.0, -1.0)      (598.0, 345.0, 1.0, -5.0)
8    (596, 343)   (598.04, 344.8, 1.0, -5.0)      (597.0, 344.0, -2.0, -8.0)
9    (596, 343)   (596.92, 343.68, -2.0, -8.0)    (597.0, 343.0, -3.0, -9.0)
730  (242, 182)   (242.56, 181.44, -11.0, 11.0)   (242.0, 182.0, -11.0, 11.0)
731  (241, 182)   (241.56, 182.44, -11.0, 11.0)   (241.56, 182.44, -11.0, 11.0)
732  (240, 183)   (240.56, 182.44, -11.0, 11.0)   (240.0, 183.0, -11.0, 11.0)
733  (239, 183)   (239.56, 183.44, -11.0, 11.0)   (239.0, 183.0, -11.0, 11.0)
734  (238, 184)   (238.56, 183.44, -11.0, 11.0)   (238.0, 184.0, -11.0, 11.0)
735  (237, 184)   (237.56, 184.44, -11.0, 11.0)   (237.0, 184.0, -11.0, 11.0)
736  (237, 184)   (236.56, 184.44, -11.0, 11.0)   (236.0, 185.0, -11.0, 11.0)
737  (235, 185)   (235.56, 185.44, -11.0, 11.0)   (235.0, 185.0, -11.0, 11.0)
738  (234, 185)   (234.56, 185.44, -11.0, 11.0)   (234.0, 185.0, -11.0, 11.0)
739  (234, 186)   (234.56, 185.44, -11.0, 11.0)   (234.0, 186.0, -11.0, 11.0)
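The tracking illustrated by the table uses a four-component Kalman state (pixel position plus two velocity-like components). Since the actual P.sub.0, Q and H values referenced in paragraph [0252] are not reproduced in this text, the sketch below is an assumed configuration: a standard constant-velocity predict/update cycle with illustrative noise covariances, showing the mechanics rather than the patent's exact tuning.

```python
import numpy as np

# State s = (u, v, du, dv): pixel position and per-frame velocity (assumed).
F = np.array([[1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)   # constant-velocity transition
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)   # only the pixel position is observed
Q = np.eye(4) * 1e-2                        # process noise covariance (assumed)
Rn = np.eye(2) * 1.0                        # measurement noise covariance (assumed)

def kalman_step(s, P, z):
    """One predict/update cycle; z is the observed pixel position o_k.
    Returns the new estimate s_k, its covariance, and the prediction s'_k."""
    # predict
    s_pred = F @ s
    P_pred = F @ P @ F.T + Q
    # update
    y = z - H @ s_pred                      # innovation
    S = H @ P_pred @ H.T + Rn
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    s_new = s_pred + K @ y
    P_new = (np.eye(4) - K @ H) @ P_pred
    return s_new, P_new, s_pred
```

Fed a light spot drifting at constant pixel velocity, the estimated position and velocity converge to the true motion within a few frames, which is the behaviour the table's s′.sub.k and s.sub.k columns exhibit.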
[0255]
[0256] System 101 differs from system 100 described above in that the first device 1 has an acoustic emitter 15 for emitting an acoustic signal which represents the depth of the first device 1. The second device 4 has an acoustic receiver 16 for receiving the acoustic signal emitted by the acoustic emitter 15.
[0257] The processing unit 9 is connected to the acoustic receiver 16 for obtaining the depth of the first device based on the signal received by the acoustic receiver 16.
[0258]
[0259] System 102 differs from system 100 described above in that the first device 1 and the second device 4 are connected to each other by a marine communication cable 17 through which the first device 1 transmits to the second device 4 information on its depth. The invention thereby solves the problem set out above while achieving a number of advantages. In particular, the system for geolocation of an object in water according to the invention has a reduced architectural complexity compared to known systems.