Methods for geospatial positioning and portable positioning devices thereof

11614546 · 2023-03-28

Abstract

Embodiments provide for a method of determining a geospatial position of a point of interest and a portable positioning device. In one embodiment, the method includes collecting data from a receiving unit and data from at least one of an imaging device and an IMU of the positioning device for each one of a plurality of positions of the positioning device. The collected data is then transmitted to a data fusing processor for determining orientations and positions of the positioning device for the plurality of positions in a global coordinate system. Further, the method includes obtaining a pointing input including a sighting direction towards the point of interest from the positioning device being positioned at at least one reference position. The pointing input is transmitted to the data fusing processor for identifying the point of interest and for determining the geospatial position of the point of interest in the global coordinate system.

Claims

1. A method for determining a geospatial position of a point of interest using a positioning device, said method comprising: collecting, by a data collector of the positioning device, data from a global navigation satellite system (GNSS) receiving unit of said positioning device and images from an imaging device of said positioning device for each one of a plurality of different locations of said positioning device above said point of interest, the data from the GNSS receiving unit obtained at an antenna of the GNSS receiving unit, and the antenna and the imaging device having a known spatial relationship within the positioning device when capturing the images at the plurality of different locations; transmitting, to a data fusing processor, said collected data for determining orientations and positions of said positioning device in a global coordinate system (X.sub.1, Y.sub.1, Z.sub.1) for at least three locations of said plurality of different locations of said positioning device; obtaining, by said data collector, from a laser-based pointing device of the positioning device, a pointing input including a sighting direction towards said point of interest from said positioning device and a distance between said point of interest and said positioning device for at least one location of said plurality of different locations; determining a position and orientation of the positioning device at said at least one location based on the orientations and positions of the positioning device as determined at said at least three locations; and transmitting said pointing input to the data fusing processor for identifying said point of interest and for determining said geospatial position of said point of interest in said global coordinate system based on said pointing input and the determined orientation and position of the positioning device in the global coordinate system at said at least one location.

2. The method of claim 1, wherein said pointing input includes at least two different sighting directions from said positioning device towards said point of interest.

3. The method of claim 1, wherein said collected data includes GNSS data received at the GNSS receiving unit and gyroscopic and acceleration data received at an inertial measurement unit (IMU) for said plurality of different locations.

4. The method of claim 3, wherein said collected data is processed simultaneously by the data fusing processor for estimating the orientations and positions of said positioning device in said global coordinate system for said plurality of different locations of said positioning device.

5. The method of claim 3, wherein said collected data further includes the images captured by said imaging device for said plurality of different locations.

6. The method of claim 1, wherein said collected data includes GNSS data received at the GNSS receiving unit and the images captured by said imaging device for said plurality of different locations.

7. The method of claim 6, wherein determining said orientations and positions of said positioning device for said plurality of different locations of said positioning device in the global coordinate system includes: orientating, with respect to each other, a series of images captured with said imaging device and generating a 3D reconstruction of a scene using the orientated captured images; obtaining positions of the antenna of said GNSS receiving unit in the global coordinate system for at least a subset of the captured images based on satellite information signals received at the GNSS antenna; defining a local coordinate system (X.sub.2, Y.sub.2, Z.sub.2) and determining positions of the imaging device for at least some images of said subset in said local coordinate system; and determining a transformation function correlating a position of a point in the global coordinate system with a position of a point in the local coordinate system based on the known spatial relationship of the GNSS antenna and said imaging device within said positioning device, the determined positions of the antenna in the global coordinate system and the corresponding positions of the imaging device in the local coordinate system for said at least some images of said subset.

8. The method of claim 6, wherein said collected data further includes acceleration and gyroscopic data received from an inertial measurement unit (IMU) for determining orientations of said positioning device for said plurality of different locations of said positioning device and/or wherein the captured images are orientated based on photogrammetry.

9. The method of claim 7, wherein determining the geospatial position of the point of interest in the global coordinate system includes determining the position of the point of interest in the local coordinate system and determining the geospatial position of the point of interest in the global coordinate system based on the determined position of the point of interest in the local coordinate system and the determined transformation function.

10. The method of claim 1, wherein said data fusing processor is an integrated part of said positioning device or wherein said data fusing processor is arranged at a remote server or device.

11. A portable positioning device adapted to obtain a geospatial position of a point of interest, said portable positioning device comprising: a global navigation satellite system (GNSS) receiving unit including an antenna adapted to receive satellite information signals from a GNSS; either one, or both of, an imaging device or an inertial measurement unit (IMU), wherein said imaging device is adapted to capture a series of images, or a video, of a scene including said point of interest, and wherein said IMU is adapted to provide acceleration and gyroscopic data, wherein a spatial position of the antenna relative to the imaging device when capturing the series of images or the video is known and/or wherein a spatial position of the antenna relative to the IMU is known; a laser-based pointing device; a data collector adapted to collect data from said GNSS receiving unit and data from at least one of said imaging device or said IMU for each one of a plurality of different locations of said positioning device above said point of interest, wherein said data collector is further configured to obtain a pointing input, from the laser-based pointing device, wherein said pointing input includes a sighting direction towards said point of interest from said positioning device and a distance between said point of interest and said positioning device for at least one location of said plurality of different locations; and a transmitter for transmitting, to a data fusing processor, said collected data and said pointing input for: determining orientations and positions of said positioning device for said plurality of different locations of said positioning device in a global coordinate system (X.sub.1, Y.sub.1, Z.sub.1), determining the position and orientation of the positioning device at said at least one location based on the orientations and positions of the positioning device as determined at said plurality of different locations, and identifying said point of interest and determining said geospatial position of said point of interest in said global coordinate system based on said pointing input and the determined orientation and position of the positioning device in the global coordinate system at said at least one location.

12. The positioning device of claim 11, wherein said data fusing processor is an integrated part of said positioning device or wherein said data fusing processor is arranged at a remote server or device.

13. The positioning device of claim 11, wherein said positioning device is configured to operate by performing steps comprising: collecting, by the data collector, the data from said GNSS receiving unit and the data from at least one of said imaging device or said IMU for each one of the plurality of different locations of said positioning device above said point of interest; transmitting, to the data fusing processor, said collected data for determining orientations and positions of said positioning device in the global coordinate system (X.sub.1, Y.sub.1, Z.sub.1) for at least three locations of said plurality of different locations of said positioning device; obtaining, by said data collector, from said laser-based pointing device, said pointing input including said sighting direction towards said point of interest from said positioning device and a distance between said point of interest and said positioning device for the at least one location of said plurality of different locations; determining the position and orientation of the positioning device at said at least one location based on the orientations and positions of the positioning device as determined at said at least three locations; and transmitting said pointing input to the data fusing processor for identifying said point of interest and for determining said geospatial position of said point of interest in said global coordinate system based on said pointing input and the determined orientation and position of the positioning device in the global coordinate system at said at least one location.

14. The positioning device of claim 11, wherein the spatial position of the antenna relative to said imaging device when capturing each one of said images, or said video, for the plurality of different locations of the positioning device is known and/or wherein the spatial position of the antenna relative to said IMU for the plurality of different locations of the positioning device is known.

15. The positioning device of claim 11, further including a display unit adapted to assist in capturing said series of images, or said video, and/or in identifying said point of interest.

16. A method for determining a geospatial position of a point of interest using a positioning device, said method comprising: collecting, by a data collector of the positioning device, data from a global navigation satellite system (GNSS) receiving unit of said positioning device and data from an inertial measurement unit (IMU) of said positioning device for each one of a plurality of different locations of said positioning device above said point of interest, wherein the data from the GNSS receiving unit is obtained at an antenna of the GNSS receiving unit, and a spatial position of the antenna relative to the IMU is known; transmitting, to a data fusing processor, said collected data for determining orientations and positions of said positioning device in a global coordinate system (X.sub.1, Y.sub.1, Z.sub.1) for at least three locations of said plurality of different locations of said positioning device; obtaining, by said data collector, from a laser-based pointing device of the positioning device, a pointing input including a sighting direction towards said point of interest from said positioning device and a distance between said point of interest and said positioning device for at least one location of said plurality of different locations; determining a position and orientation of the positioning device at said at least one location based on the orientations and positions of the positioning device as determined at said at least three locations; and transmitting said pointing input to the data fusing processor for identifying said point of interest and for determining said geospatial position of said point of interest in said global coordinate system based on said pointing input and the determined orientation and position of the positioning device in the global coordinate system at said at least one location.

17. The method of claim 16, wherein said collected data includes GNSS data received at the GNSS receiving unit and gyroscopic and acceleration data received at the IMU for said plurality of different locations.

18. The method of claim 17, wherein said collected data is processed simultaneously by the data fusing processor for estimating the orientations and positions of said positioning device in said global coordinate system for said plurality of different locations of said positioning device.

Description

DESCRIPTION OF THE DRAWINGS

(1) Exemplifying embodiments will now be described in more detail, with reference to the following appended drawings:

(2) FIG. 1 shows a schematic view of a portable positioning device adapted to determine the geospatial position of a point of interest in accordance with some embodiments;

(3) FIG. 2 illustrates at least part of a workflow, or scenario, of a method for determining the geospatial position of a point of interest, in accordance with some embodiments;

(4) FIG. 3 shows a flow chart illustrating a general overview of a method for determining the geospatial position of a point of interest in accordance with some embodiments;

(5) FIG. 4 shows an example of a two-dimensional image captured by a portable positioning device in accordance with an embodiment;

(6) FIG. 5 shows an example of a 3D reconstruction generated by a portable positioning device in accordance with an embodiment;

(7) FIG. 6 illustrates the identification of a point of interest in accordance with an embodiment;

(8) FIG. 7 illustrates the identification of a point of interest in accordance with another embodiment;

(9) FIG. 8 illustrates the identification of a point of interest in accordance with yet another embodiment;

(10) FIG. 9 shows a schematic view of a portable positioning device in accordance with an embodiment;

(11) FIG. 10 shows a schematic view of a portable positioning device in accordance with another embodiment; and

(12) FIG. 11 is a flow chart illustrating the methods of the present disclosure.

(13) As illustrated in the figures, the sizes of the elements and regions may be exaggerated for illustrative purposes and, thus, are provided to illustrate the general structures of the embodiments. Like reference numerals refer to like elements throughout.

DETAILED DESCRIPTION

(14) Exemplifying embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which currently preferred embodiments are shown. The invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for thoroughness and completeness, and to fully convey the scope of the invention to the skilled person.

(15) With reference to FIG. 1, a portable positioning device 100 according to some embodiments of the present disclosure is described.

(16) FIG. 1 shows a portable positioning device 100 comprising a GNSS receiving unit 110 including an antenna 120, an imaging device 130, a display unit 140, an inertial measurement unit (IMU) or a set of gyroscopes and accelerometers 170, a processor/processing unit or data collector 150 and a transmitter 155.

(17) FIG. 1 also shows a data fusing processor 190 in communication with the positioning device 100 via the transmitter 155. This is for illustration purposes only, and the transmitter 155 need not be a separate entity. The data fusing processor 190 may, in some embodiments, be in direct communication with each one of the IMU 170, the imaging device 130 and the GNSS receiving unit 110.

(18) Further, although the data fusing processor 190 is shown to be at a remote location, such as for example a remote server of an internet cloud infrastructure, or a remote device in the embodiment of FIG. 1, the data fusing processor 190 may be an integrated part of the positioning device 100.

(19) In some embodiments, the data fusing processor 190 and the data collector 150 may be the same entity.

(20) The antenna 120 may have a phase center 115 and may be adapted to receive satellite information signals from a GNSS. One satellite 160 of the GNSS is depicted in FIG. 1 for illustration purposes. The antenna 120 may be adapted to receive signals from four or more space-based orbiting sources (or satellites) of the GNSS. The antenna 120 may for example include an antenna patch, a ceramic element, a low noise amplifier and filters. The GNSS antenna 120 may be lodged within a housing of the positioning device 100.

(21) The GNSS signals may for example be received from any GNSS such as GPS, GLONASS, Galileo, Compass/Beidou, QZSS, SBAS, IRNSS or the like. The antenna 120 may also be referred to as the GNSS antenna 120. The antenna 120 may be connected to, or may be part of, a GNSS receiver, GNSS receiver unit or GNSS board 110. In some embodiments, the GNSS receiving unit 110 may include the GNSS antenna 120 and a processing unit, or processor, for computing a position of the antenna in the GNSS based on the signals received at the antenna. In some other embodiments, the processing unit of the receiving unit may be part of the processing unit 150 of the positioning device 100. The GNSS receiving unit 110 may therefore be adapted to transmit, to the processing unit 150 of the positioning device 100, either the satellite information signals received at the antenna 120 or a position computed based on the received signals.

(22) The basic operating principle of a GNSS receiver, or of a positioning device based on GNSS signals in general, is to calculate its position by precisely timing the signals sent by satellites of the GNSS. Each of the messages broadcast by the satellites includes a time stamp indicating the time the message was transmitted from the satellite and the satellite position when the message was transmitted. A distance to each of the satellites may then be derived based on the transit time of each message and the speed of light. Combining these distances then yields the location (two- or three-dimensional position) of the positioning device, or in the present case of the phase center 115 of the antenna 120 of the positioning device 100.
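As an illustration of this principle, the position computation can be sketched as a small least-squares solver: given satellite positions and pseudoranges (measured transit times multiplied by the speed of light), the receiver position and its clock bias are estimated iteratively. This is a minimal sketch with illustrative inputs, not the actual firmware of a GNSS receiving unit.

```python
import numpy as np

C = 299_792_458.0  # speed of light (m/s)

def solve_position(sat_pos, pseudoranges, iters=10):
    """Gauss-Newton estimate of receiver position and clock bias.

    sat_pos: (N, 3) satellite positions (m); pseudoranges: (N,) measured
    distances (m), i.e. transit times multiplied by C; N >= 4.
    """
    x = np.zeros(4)  # state: [X, Y, Z, clock_bias * C], starting at Earth's centre
    for _ in range(iters):
        diff = x[:3] - sat_pos               # receiver-to-satellite vectors (N, 3)
        rho = np.linalg.norm(diff, axis=1)   # geometric ranges
        residual = pseudoranges - (rho + x[3])
        # Jacobian: unit line-of-sight vectors plus the clock-bias column
        J = np.hstack([diff / rho[:, None], np.ones((len(rho), 1))])
        dx, *_ = np.linalg.lstsq(J, residual, rcond=None)
        x += dx
        if np.linalg.norm(dx) < 1e-4:        # converged
            break
    return x[:3], x[3] / C                   # position (m), clock bias (s)
```

With four or more satellites the system is (over)determined, which is why the antenna 120 is described as receiving signals from four or more orbiting sources.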

(23) The imaging device 130 of the positioning device 100 is arranged at a certain position relative to the GNSS antenna 120 in the positioning device 100. In the present embodiment, the imaging device 130 is not aligned with the antenna 120 of the positioning device 100. The imaging device 130 may have an optical axis 135 as determined by, for example, the axis or line along which there is rotational symmetry in the imaging device 130. The optical axis 135 of the imaging device 130 may for example correspond to the axis passing through the center of a lens of the imaging device 130 or the axis passing through the center of the image sensor (not specifically shown in FIG. 1) of the imaging device 130. The optical axis 135 of the imaging device 130 may, in some embodiments, but not necessarily, correspond to the line of sight of the imaging device 130, which may also be referred to as the sighting axis. Although not necessary, the phase center 115 of the GNSS antenna 120 may, in some embodiments, be arranged along the optical axis 135 of the imaging device 130.

(24) The imaging device 130 may for example be a digital camera including an image sensor such as a semiconductor charge-coupled device (CCD), a complementary metal-oxide-semiconductor (CMOS) sensor or another active digital pixel sensor.

(25) The display unit 140 of the positioning device 100 may be provided for assisting an operator of the portable positioning device 100 in capturing a series of images of a scene including a point of interest 180 within a field of view 132 of the imaging device 130. The display unit 140 may also be used for assisting in identifying, or selecting, a point of interest in the scene, as will be further explained with reference to FIG. 8. FIG. 1 also shows a front view of the display unit 140, in which an image of the ground within the field of view 132 of the imaging device 130 is displayed. For illustration purposes, the point of interest 180 is identified by a triangle in the image displayed in the display unit 140.

(26) The IMU 170 may be adapted to provide acceleration and gyroscopic data about the positioning device 100.

(27) The data collector or processor 150 may be in communication with the imaging device 130, the GNSS receiving unit 110, the display unit 140 and the IMU 170. In particular, the data collector or processor 150 may be adapted to receive GNSS signals (or satellite information signals) from the GNSS receiving unit 110 or, as the case may be, to directly receive a position of the GNSS antenna 120 of the GNSS receiving unit 110 as computed by the GNSS receiving unit 110.

(28) Further, the data collector or processor 150 may be adapted to control the imaging device 130 to cause the capture of one or more images in order to obtain a scan of a scene at which the point of interest 180 is located. The data collector or processor 150 may also be adapted to receive the images, or at least data corresponding to the images, captured by the imaging device 130.

(29) Similarly, the data collector or processor 150 may be adapted to receive data collected by the IMU 170. Further, the data collector or processor 150 may be adapted to control the information and/or images displayed by the display unit 140 and also adapted to receive information entered by an operator via the display unit 140.

(30) In other words, the positioning device 100 comprises a data collector or processor 150 and three different types of sensors including the imaging device 130, the GNSS receiving unit 110 and the optional IMU 170. The display unit 140 may be used to display information and/or to receive information.

(31) As will be described in the following, a geospatial position of a point of interest 180 in a scene may be determined based on data collected at the GNSS receiving unit and either one, or both, of data collected at the IMU 170 and images captured with the imaging device 130.

(32) Thus, still referring to FIG. 1, according to an embodiment, the imaging device 130 of the positioning device 100 may be optional and may more generally be a pointing device 130. In this embodiment, the GNSS receiving unit 110 and the IMU 170 are used as the primary detectors for determining the geospatial position of the point of interest.

(33) According to another embodiment, the IMU 170 of the positioning device may be optional and the GNSS receiving unit 110 and the imaging device 130 are used as the primary detectors for determining the geospatial position of the point of interest.

(34) According to yet another embodiment, as shown in FIG. 1, the positioning device 100 may include both an imaging device 130 and an IMU 170. The data collector 150 of the positioning device 100 may then be configured to collect the data received at the GNSS receiving unit 110 and either one, or both, of the data received at the IMU 170 and the imaging device 130 for obtaining a geospatial position of the point of interest 180.

(35) In the following, a procedure based on data collected at the GNSS receiving unit 110 and the imaging device 130 for determining a geospatial position of the point of interest 180 is first described with reference to FIGS. 2-5.

(36) FIG. 2 illustrates an embodiment of at least part of a workflow of a method for determining the geospatial position of a point of interest using a portable positioning device such as for example the positioning device 100 described with reference to FIG. 1.

(37) FIG. 2 shows a scenario in which the positioning device 100 is placed at four different positions for capturing four different images of a scene including a point of interest denoted 280. For illustration purposes, only a part of the positioning device 100 is represented in FIG. 2. In particular, the positioning device is represented by a plane 237 which may correspond to the image chip (or image sensor) in the imaging device 130 of the positioning device 100.

(38) In the embodiment shown in FIG. 2, the data collector or processor 150 of the positioning device 100 may cause the capture of four overlapping images of the scene at which the point of interest 280 is located, as represented by the overlapping zones 232a and 232c of the field of view of the imaging sensor 237. For this purpose, an operator may move to different places and capture a series of images, such as four in the present example, of the scene. The four different locations at which the four images of the scene are captured may correspond to the positions 239a, 239b, 239c and 239d of the projection center of the imaging device 130 (the lens being not shown in FIG. 2). It will be appreciated that another point of the imaging device may be taken as a reference, such as for example the center of the imaging sensor 237.

(39) The positioning device 100, or rather the data fusing processor 190, may define an arbitrary local coordinate system (X.sub.2, Y.sub.2, Z.sub.2). As illustrated in FIG. 2, the arbitrary coordinate system does not need to be centered at one of the positions 239a, 239b, 239c or 239d of the imaging device 130 from which the images are captured. It will be appreciated that the positions 239a, 239b, 239c and 239d are arbitrarily selected by the operator of the positioning device 100 when capturing the images, or a video, of the scene, and that the positions of the imaging device in the arbitrary coordinate system (X.sub.2, Y.sub.2, Z.sub.2) for the four different images are thus, as such, not known at the beginning of the method.

(40) FIG. 2 also shows only one satellite 260 from which GNSS signals may be sent to a GNSS antenna (not shown in FIG. 2) of the positioning device 100. As mentioned above, the GNSS receiving unit may receive signals from four or more satellites, and the signals may be processed to determine the position of the phase center of the antenna 120 of the positioning device 100 in an absolute coordinate system (X.sub.1, Y.sub.1, Z.sub.1) relating to the GNSS.

(41) In the following, with reference to FIGS. 1-5, a method for determining a geospatial position of the point of interest 180, 280 will be described.

(42) It will be appreciated that, although described in a specific order in the following, the steps may be performed in another order. Generally, the method includes steps for determining a position of the point of interest in the local coordinate system (X.sub.2, Y.sub.2, Z.sub.2) and then deriving the geospatial position of the target point (or point of interest) 280 in the absolute coordinate system (X.sub.1, Y.sub.1, Z.sub.1) from its position in the local coordinate system (X.sub.2, Y.sub.2, Z.sub.2).

(43) As illustrated in FIG. 3, the method includes a step 3100 of capturing, or causing the capture of, a series of images of the scene in which the point of interest 180, 280 is located. Step 3100 corresponds for example to the scenario depicted in FIG. 2 wherein four images are taken at four different positions of the positioning device 100. It will be appreciated that the use of four images is just an example and that at least three images may be captured.

(44) An example of a two-dimensional image of a path border captured by the imaging device 130 is shown in FIG. 4 for illustration purposes. The corner of the path border may be the point of interest 480 in the present example. The procedure may be repeated a number of times such that a plurality, or a series, of images of the path border and its surrounding is captured.

(45) Referring to FIG. 3, at 3200, the captured images (four in the present example) may then be orientated with respect to each other and a three-dimensional (3D) reconstruction of the scene may be generated, at 3250, using the orientated series of captured images.

(46) It will be appreciated that the captured images may be orientated in real time, in the sense that they are successively orientated as they are captured. It is therefore not necessary to wait until all images have been captured. Further, the orientation of the images need not be static: the orientation of an already orientated image may be refined, or readjusted, based on newly captured images.

(47) Different techniques may be employed for orientating the images of the scene captured by the imaging device.

(48) According to an embodiment, the captured images may be orientated by identifying common characteristic features among the captured images and/or by using the structure-from-motion (SfM) technique or any other photogrammetric technique enabling images to be orientated with respect to each other based on the content of the captured images themselves. Based on a recognition of characteristic features in the captured images, for example based on contrast/brightness values representative of different levels of light reflectivity of the objects of the scene, the images may be orientated.
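As a rough illustration of such content-based orientation, the following sketch uses OpenCV to match characteristic features between two overlapping images and to recover their relative orientation from the essential matrix. The camera matrix K and the image file names are assumptions made for illustration; a full SfM pipeline would chain this pairwise step over the whole series and refine the result by bundle adjustment.

```python
import cv2
import numpy as np

# Illustrative camera intrinsics (focal length and principal point in pixels)
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])

img1 = cv2.imread("view1.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view2.png", cv2.IMREAD_GRAYSCALE)

# Detect characteristic features and their binary descriptors in both images
orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Match descriptors: these are the common characteristic features
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# RANSAC on the essential matrix rejects mismatched features; the relative
# pose (rotation R and translation direction t) is then recovered from it
E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
```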

(49) Turning to the example illustrated in FIG. 2, the series of images may not necessarily be captured in the order defined by the positions 239a, 239b, 239c and 239d of the imaging device 130. Assuming that the images are captured in a different order, and/or using different sighting directions, for example in a sequence at positions 239a, 239c, 239d and 239b, such photogrammetric techniques make it possible to orientate the images with respect to each other by recognizing characteristic features between the images. It will also be appreciated that it is beneficial if the images to be orientated with respect to each other overlap.

(50) According to an embodiment, if the positioning device 100 includes an IMU 170, the captured images may be orientated at 3200 based on acceleration and gyroscopic data received from the IMU 170 of the positioning device 100. In other words, in this embodiment, the images may be orientated based on data representative of the motion of the imaging device between the capture of different images instead of being based on the content of the images themselves. For example, from a first position of the IMU 170 when capturing the first image, the acceleration and gyroscopic data obtained from the IMU 170 make it possible to determine the relative position of the imaging device when the next image is captured. Based on the data obtained by the IMU 170 for the series of captured images, the images can be orientated with respect to each other.
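A minimal sketch of this motion-based alternative, assuming SciPy is available and that the IMU samples between two exposures are time-aligned: gyroscope rates are integrated for orientation, and gravity-compensated accelerations are integrated twice for position. A practical implementation would additionally estimate and remove sensor biases.

```python
import numpy as np
from scipy.spatial.transform import Rotation

GRAVITY = np.array([0.0, 0.0, -9.81])  # world-frame gravity (m/s^2)

def propagate(R0, p0, v0, gyro, accel, dt):
    """Propagate the device pose from one image capture to the next.

    R0: orientation at the first capture (scipy Rotation, body-to-world)
    p0, v0: position and velocity at the first capture (world frame)
    gyro: (N, 3) angular rates (rad/s); accel: (N, 3) specific force (m/s^2)
    dt: IMU sample period (s)
    """
    R, p, v = R0, p0.copy(), v0.copy()
    for w, a in zip(gyro, accel):
        a_world = R.apply(a) + GRAVITY        # rotate specific force, remove gravity
        v = v + a_world * dt                  # integrate acceleration to velocity
        p = p + v * dt                        # integrate velocity to position
        R = R * Rotation.from_rotvec(w * dt)  # apply the small body-frame rotation
    return R, p, v
```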

(51) Further, although the position of the imaging device 130 is represented to be fixed with respect to the position of the IMU 170 in the positioning device 100 shown in FIG. 1, the imaging device 130 may, in some other embodiments, be arranged in a movable or adjustable manner relative to the IMU 170 in the positioning device 100. In this case, the spatial position of the imaging device 130 relative to the IMU 170 may vary from the capture of one image to another. The position of the imaging device 130 relative to the IMU 170 may be known, for example using some kind of encoder (angle detector) of the positioning device or the like, and the observation of the IMU 170 for the different images may be corrected accordingly.

(52) Further, the observations of the IMU 170 when capturing the series of images may be used in combination with photogrammetric techniques for orientating the images in order to improve the orientation of the images.

(53) As mentioned above, the processing unit 150 may then be adapted, at 3250, to generate a 3D reconstruction of the scene based on the orientated series of images. FIG. 5 shows an example of the scene captured by images such as the image shown in FIG. 4, i.e. FIG. 5 shows a 3D reconstruction of the path border with its corner 580. The 3D reconstruction may for example be displayed at the display unit 140 of the positioning device 100.
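Continuing the pairwise photogrammetric sketch above, once two images are orientated with respect to each other the scene can be reconstructed by triangulating the matched points. K, R, t, pts1 and pts2 are carried over from that sketch and remain illustrative assumptions; for a clean reconstruction only the inlier matches should be kept.

```python
import cv2
import numpy as np

# Projection matrices: first camera at the origin, second at the relative pose
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])

# cv2.triangulatePoints takes 2xN point arrays and returns 4xN homogeneous points
X_h = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
X = (X_h[:3] / X_h[3]).T  # Euclidean 3D points of the reconstructed scene
```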

(54) Turning again to FIG. 3, at 3300, positions of the antenna 120 in the global coordinate system for at least a subset of the captured images may be determined based on satellite information signals received at the GNSS antenna 120. In the present example, the 3D positions of the antenna 120 in the global coordinate system may be determined for at least three or four of the captured images. Accordingly, a first list with the 3D positions of the antenna in the global coordinate system for some of the captured images (the subset) is obtained.

(55) Further, the data fusing processor 190 may at 3400 define an arbitrary local coordinate system (X.sub.2, Y.sub.2, Z.sub.2), which is fixed with respect to the absolute coordinate system (X.sub.1, Y.sub.1, Z.sub.1) of the GNSS and may at 3450 determine the positions of the imaging device for at least some of the captured images, such as for example three or four of the images in the present example, in the local coordinate system (X.sub.2, Y.sub.2, Z.sub.2).

(56) As for the orientation of the images captured by the imaging device 130, the position of the imaging device 130 in the local coordinate system (X.sub.2, Y.sub.2, Z.sub.2) for three or more of the images may be determined by photogrammetry based on the generated 3D reconstruction, using for example a triangulation technique, and/or based on acceleration and/or gyroscopic data received from the IMU 170 of the positioning device 100.

(57) As a result, a second list with the 3D positions of the imaging device in the local coordinate system for at least some images of the subset is obtained.

(58) The data fusing processor 190 may then at 3500 determine a transformation function correlating a position of a point in the global coordinate system (X.sub.1, Y.sub.1, Z.sub.1) with a position of a point in the local coordinate system (X.sub.2, Y.sub.2, Z.sub.2) based on the determined 3D positions of the antenna 120 in the global coordinate system, the corresponding positions of the imaging device 130 in the local coordinate system (X.sub.2, Y.sub.2, Z.sub.2) for the images of the subset and a known spatial position of the imaging device 130 relative to the GNSS antenna 120 within the positioning device 100 for the captured images (the antenna offset). Although the spatial position of the imaging device relative to the GNSS antenna may, in general, vary from one image to another, in the present example it is the same for all captured images.

(59) In other words, the first list of 3D positions of the antenna in the global coordinate system, the second list of 3D positions of the imaging device in the local coordinate system for at least some images of the subset, and the known spatial position of the antenna relative to the imaging device within the portable positioning device when capturing each one of the images are used by the processing unit to establish the transformation function between the global coordinate system (the absolute coordinate system) and the local coordinate system.

(60) It will be appreciated that at least three non-identical, and in particular non-collinear, images (i.e. images not taken along the same sighting axis) may be used for the purpose of determining the transformation function for correlating the position of a point in the local coordinate system with the position of a point in the absolute coordinate system.
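One standard way to compute such a transformation function, sketched below under the assumption that the local system may differ from the global one by scale as well as by rotation and translation (as is typical for a photogrammetric reconstruction), is the Umeyama/Kabsch method: given at least three non-collinear corresponding positions, it estimates s, R and t such that global ≈ s·R·local + t. This is an illustration, not necessarily the method used by the data fusing processor 190.

```python
import numpy as np

def fit_similarity(local_pts, global_pts):
    """Estimate (s, R, t) with global ~ s * R @ local + t.

    local_pts, global_pts: (N, 3) corresponding positions, N >= 3,
    not all collinear.
    """
    mu_l, mu_g = local_pts.mean(axis=0), global_pts.mean(axis=0)
    L, G = local_pts - mu_l, global_pts - mu_g
    U, S, Vt = np.linalg.svd(G.T @ L / len(local_pts))   # cross-covariance
    D = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:
        D[2, 2] = -1.0                                   # avoid a reflection
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / L.var(axis=0).sum()   # scale factor
    t = mu_g - s * R @ mu_l
    return s, R, t

def local_to_global(p, s, R, t):
    """Map a point (e.g. the point of interest) from the local to the global frame."""
    return s * R @ p + t
```

Here local_pts would be the imaging-device positions in the local coordinate system (the second list) and global_pts the corresponding positions derived from the antenna positions (the first list) after applying the known antenna offset.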

(61) The data collector or processor 150 may receive a pointing input such that the data fusing processor 190 can identify at 3600 the point of interest 580 in the generated 3D reconstruction 500 of the scene and determine at 3650 the 3D position of the point of interest in the local coordinate system based on the generated 3D reconstruction. The geospatial position of the point of interest in the global coordinate system may then be determined at 3700 based on the determined 3D position of the point of interest in the local coordinate system and the determined transformation function.

(62) The geospatial position of the point of interest may be its three-dimensional position in the global coordinate system but may also include only a two-dimensional position or only the altitude, as desired.

(63) As already mentioned, in some embodiments, the series of captured images may be a captured video of the scene including the point of interest.

(64) Further, the display unit may be configured to display a two-dimensional image 400 of the series of images, the 3D reconstruction 500 of the scene, the 3D position of the point of interest determined by the processing unit and/or an indication as to whether the GNSS receiving unit is activated.

(65) In the following, another procedure based on data collected at the GNSS receiving unit 110 and the IMU 170 for determining a geospatial position of the point of interest 180 is described.

(66) As in the procedure described above, in which the data from the GNSS receiving unit 110 and images from the imaging device 130 are used to provide a 3D reconstruction of the surroundings in which the point of interest is located, a first step is to determine the orientations and positions of the positioning device in the vicinity of the point of interest in the global coordinate system.

(67) For this purpose, the operator may, as shown in FIG. 2 for the embodiment based on the use of the imaging device data, position the positioning device at different locations above the point of interest. The data collector 150 may then be configured to collect the GNSS data received at the GNSS receiving unit 110 and the acceleration and gyroscopic data detected by the IMU 170 during this procedure.

(68) While the GNSS receiving unit 110 provides positions of the positioning device, the IMU 170 provides data representative of the motion of the positioning device between the acquisition of successive GNSS data. From a first reference position of the positioning device 100 (which may be arbitrarily selected), the acceleration and gyroscopic data obtained from the IMU 170 make it possible to determine the position and orientation of the positioning device relative to its position and orientation at the first reference position.

(69) Accordingly, the positions and orientations of the positioning device 100 in the vicinity of, or above, a surface including the point of interest may be determined in the global coordinate system at any instant based on the data obtained by the IMU 170 and the GNSS receiving unit 110. The collection of the data may be performed by the data collector 150, while the determination of the orientations and positions may be performed by the data fusing processor 190.
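As a deliberately simplified illustration of such fusion, the sketch below runs a linear Kalman filter over position and velocity, with IMU-derived world-frame accelerations driving the prediction and GNSS position fixes driving the correction. Actual data fusing processors typically use an error-state filter that also estimates orientation and sensor biases; the noise parameters here are illustrative assumptions.

```python
import numpy as np

class PosVelKalman:
    """Toy loosely-coupled GNSS/IMU fusion over [position, velocity]."""

    def __init__(self, dt, accel_noise=0.5, gnss_noise=0.03):
        self.dt = dt
        self.x = np.zeros(6)                          # state: 3D position, 3D velocity
        self.P = np.eye(6)                            # state covariance
        self.F = np.eye(6)
        self.F[:3, 3:] = np.eye(3) * dt               # constant-velocity transition
        self.Q = np.eye(6) * (accel_noise * dt) ** 2  # process noise
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # GNSS observes position
        self.Rm = np.eye(3) * gnss_noise ** 2         # measurement noise

    def predict(self, a_world):
        """IMU step: a_world is gravity-compensated acceleration, world frame."""
        B = np.vstack([np.eye(3) * 0.5 * self.dt ** 2, np.eye(3) * self.dt])
        self.x = self.F @ self.x + B @ a_world
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, gnss_pos):
        """GNSS step: correct the state with a position fix."""
        y = gnss_pos - self.H @ self.x                # innovation
        S = self.H @ self.P @ self.H.T + self.Rm
        K = self.P @ self.H.T @ np.linalg.inv(S)      # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P
```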

(70) The data collector 150 may then be configured to obtain a pointing input, in the form of for example a sighting direction and a distance, or two different sighting directions, in order to identify the point of interest. The pointing input may be obtained by means of a pointing device such as a laser rangefinder, a laser pointer (not shown) and/or the imaging device 130. The pointing input may be indicative of a position of the point of interest 180 relative to the positioning device 100.

(71) The pointing input may then be transmitted to the data fusing processor 190 for determining the geospatial position of the point of interest 180 in the global coordinate system.

(72) With reference to FIGS. 6-8, different techniques for identifying the point of interest for which the geospatial position is to be determined are described.

(73) According to an embodiment, FIG. 6 illustrates that the point of interest 180 may be identified as the intersection of two different sighting directions. For example, the point of interest may be identified by capturing at least two reference images, using at least two different sighting directions 610 and 620, of a portion of the scene at which the point of interest 180 is located. For this purpose, the imaging device may be equipped with a fiducial marker assisting the operator in sighting towards the point of interest. The point of interest can then be determined by the data collector or processor 150 or the data fusing processor 190 to be at the intersection between the two sighting directions 610 and 620.
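Numerically, two sighting rays in 3D will in general not intersect exactly, so the point of interest can be estimated as the midpoint of their closest approach, as in this sketch (p1, d1 and p2, d2 being the determined device positions and unit sighting directions for the two observations):

```python
import numpy as np

def intersect_sightings(p1, d1, p2, d2):
    """Estimate the point closest to two sighting rays p_i + t_i * d_i."""
    # Solve, in the least-squares sense, p1 + t1*d1 = p2 + t2*d2 for t1, t2
    A = np.stack([d1, -d2], axis=1)                 # 3x2 system matrix
    t, *_ = np.linalg.lstsq(A, p2 - p1, rcond=None)
    q1, q2 = p1 + t[0] * d1, p2 + t[1] * d2         # closest points on each ray
    return 0.5 * (q1 + q2)                          # point-of-interest estimate
```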

(74) Although the embodiment shown in FIG. 6 is based on the use of an imaging device 130 as a pointing device, which is convenient in particular if the imaging device is also used together with the GNSS receiving unit for determining the orientations and positions of the positioning device in the global coordinate system, the pointing device may be another device capable of providing sighting directions towards a point of interest, such as a laser rangefinder or a laser pointer. The input from the pointing device may be collected at the data collector 150 and then transmitted by the transmitter 155 to the data fusing processor 190 for identifying the point of interest.

(75) In accordance with another embodiment, FIG. 7 illustrates that the point of interest 180 may be identified by capturing at least one reference image, using at least one sighting direction 730, of a portion of the scene at which the point of interest is located. The point of interest may be determined as the intersection of the sighting direction with a plane 750 representative of the 3D reconstruction.

(76) In the embodiment shown in FIG. 7, an imaging device 130 is used as a pointing device. Further, for the embodiment in which the imaging device is used together with the GNSS receiving unit for determining the orientations and positions of the positioning device in the global coordinate system, wherein a 3D reconstruction of the surrounding is generated, it is possible to identify the point of interest as the intersection of the sighting direction with a plane 750 representative of the 3D reconstruction.
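A sketch of this variant, assuming the relevant part of the 3D reconstruction has been summarised by a plane through a point c with unit normal n, and with p and d the determined device position and sighting direction:

```python
import numpy as np

def intersect_plane(p, d, c, n):
    """Intersect the sighting ray p + t*d with the plane (c, n)."""
    denom = np.dot(n, d)
    if abs(denom) < 1e-9:
        raise ValueError("sighting direction is parallel to the plane")
    t = np.dot(n, c - p) / denom   # signed distance along the ray
    return p + t * d               # point-of-interest estimate
```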

(77) However, in other embodiments based on, for example, the use of the data from the IMU and the GNSS receiving unit, the pointing input may be a sighting direction, as shown in FIG. 7, and a distance from the positioning device to the point of interest (for the position from which the sighting direction is obtained).

(78) The pointing device may be a device providing a sighting direction together with the possibility of measuring a distance, such as a laser rangefinder. The input from the pointing device may be collected at the data collector 150 and then transmitted by the transmitter 155 to the data fusing processor 190 for identifying the point of interest.
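For this rangefinder variant the computation is direct: the measured distance is carried along the sighting direction, rotated into the global frame using the device pose determined earlier. R_wb (the body-to-world orientation matrix) and d_body (the sighting direction in the device frame) are illustrative names.

```python
import numpy as np

def point_from_range(p_world, R_wb, d_body, distance):
    """Point of interest from device position, orientation, sighting direction and range."""
    d_world = R_wb @ (d_body / np.linalg.norm(d_body))  # unit direction, world frame
    return p_world + distance * d_world
```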

(79) In accordance with another embodiment, FIG. 8 illustrates that the point of interest may be identified by displaying the 3D reconstruction 500 of the scene at the display unit 140 and by receiving an input indicating the point of interest in the 3D reconstruction. The display unit may be a touch screen with zooming function such that an operator can point at the point of interest, such as illustrated by the arrow in FIG. 8.

(80) Further, the point of interest may not necessarily be located in one of the images for which the position of the GNSS antenna has been obtained and/or for which the position of the imaging device has been determined. As long as the point of interest is located in the 3D reconstruction, the position of the point of interest in the local coordinate system can be determined by photogrammetry and its absolute position can be determined using the transformation function.

(81) Again, although the determination of the geospatial position of the point of interest has now been described by procedures using data from the GNSS receiving unit 110 and either one of data from the IMU 170 and images captured with the imaging device 130, the geospatial position may be obtained by the data fusing processor 190 using all data collected at these three detectors.

(82) Further, it will be appreciated that the procedures may be complementary. The orientations and positions of the positioning device in the global coordinate system in the surrounding of the point of interest may first be determined using the imaging device 130 (and the GNSS receiving unit 110); if it then becomes difficult or impossible to determine the orientations and positions based on the captured images, for example because of darkness and/or contrast issues, the recording of the orientations and positions of the positioning device may continue using the data received from the IMU (and vice versa).

(83) With reference to FIG. 9, a positioning device in accordance with another embodiment is described.

(84) FIG. 9 shows a portable positioning device 900 including a GNSS receiving unit 910 and a display unit 940. The portable positioning device 900 also includes a body 905 in which the processing unit (not shown) of the positioning device 900 may be arranged. Alternatively, the processing unit of the positioning device may be arranged in the same unit as the display unit, such as for example at the backside of the display unit 940. In the embodiment shown in FIG. 9, the body 905 is in the form of a cylinder, which may conveniently be held in an operator's hand. However, other geometries and arrangements may be envisaged.

(85) In some embodiments, the element denoted 940 in FIG. 9 may be a smartphone including a display unit 940, a processing unit and an imaging device (not shown in this view). In the present example, the positioning device 900 may include the body 905 and a holder, or holding element (not denoted), attached to the body 905 and adapted to receive a unit including an imaging device, a display unit and a processing unit, such as e.g. a smartphone.

(86) Generally, the processing unit, the imaging device, the display unit, the GNSS receiving unit and the IMU of the positioning device 900 may be equivalent, from a functional point of view, to the processing unit 150, the imaging device 130, the display unit 140, the GNSS receiving unit 110 and the IMU 170 of the positioning device 100 described with reference to FIGS. 1-8. The characteristics described in the preceding embodiments for these elements therefore apply to the present embodiment.

(87) In the present embodiment, the GNSS receiving unit 910, or at least its antenna, is arranged to be positioned horizontally. For this purpose, the positioning device 900 may be equipped with a stabilization device 975 to maintain the GNSS receiving unit in a horizontal position. In the present example, the stabilization device 975 may comprise a tilt sensor for detecting a deviation of the antenna of the GNSS receiving unit 910 from horizontal and a motorized system for rotating the GNSS receiving unit 910 such that it is maintained in a horizontal position.

(88) FIG. 9 also illustrates embodiments of the present disclosure in which the portable positioning device may be implemented based on an existing device 940 already including a processing unit, an imaging device and, optionally, a display unit, to which a module including the GNSS receiving unit is added. Expressed differently, embodiments of the present disclosure include an add-on module including only a GNSS receiving unit with its antenna, in which case the processing unit of the existing device is adapted to operate in accordance with a method as defined in any one of the preceding embodiments.

(89) In the present embodiment, the processing unit of the existing device may function as the data collector 150 and/or the data fusing processor 190 described with reference to the preceding embodiments (see FIG. 1).

(90) FIG. 10 shows a schematic view of a portable positioning device 1000 in accordance with another embodiment.

(91) The portable positioning device 1000 may comprise a body including a first portion 1060 for holding the positioning device (for example by hand, as with a smartphone) and a second portion 1070 in which at least the GNSS antenna (or the GNSS receiving unit) is arranged. The imaging device 1030 may be provided in the first portion 1060.

(92) In the present embodiment, the first portion 1060 and the second portion 1070 are not mounted at a fixed angle with respect to each other; instead, the first portion 1060 is connected to the second portion 1070 by means of a hinge 1080 to allow the second portion 1070 to swing, or rotate, with respect to the first portion 1060. The rotation of the second portion 1070 about the hinge 1080 is represented by the angle α formed between the first portion 1060 and the second portion 1070. The structural configuration of the positioning device 1000 may therefore vary between an unfolded configuration, in which the angle α is different from zero, and a folded configuration, in which the first portion 1060 comes against the second portion 1070 such that the angle α is equal to, or at least close to, zero.

(93) Referring to FIGS. 1 and 11, a method for determining a geospatial position of a point of interest in accordance with some embodiments is described.

(94) The method comprises, at 1110, collecting, by a data collector 150 of a positioning device 100, data from the GNSS receiving unit 110 of the positioning device 100 and data from at least one of the imaging device 130 and the IMU 170 of the positioning device for a plurality of positions of the positioning device in the vicinity of the point of interest.

(95) The method may then comprise, at 1120, transmitting, to a data fusing processor 190, the collected data for determining orientations and positions of the positioning device for the plurality of positions of the positioning device in a global coordinate system (X.sub.1, Y.sub.1, Z.sub.1).

(96) The method may then include, at 1130, obtaining by the data collector 150 a pointing input indicative of a position of the point of interest 180 relative to the positioning device 100 for at least one reference position of the positioning device. The pointing input may then be transmitted, at 1140, to the data fusing processor 190 for identifying the point of interest. The geospatial position of the point of interest in the global coordinate system may then be determined, at 1150, by the data fusing processor 190 based on the determined orientations and positions of the positioning device in the global coordinate system and the pointing input.

(97) The person skilled in the art realizes that the present invention by no means is limited to the preferred embodiments described above. On the contrary, many modifications and variations are possible within the scope of the appended claims.

(98) For example, the positioning device may include a plurality of imaging devices in order to improve its orientation in the surrounding of the point of interest. Using a plurality of imaging devices has the benefit of covering a larger portion of the surroundings for each one of the locations at which the positioning device is held by the user (i.e. without being limited to the field of view of a single imaging device). For each of the plurality of positions of the positioning device in the vicinity of the point of interest, the imaging device(s) may be configured to provide a panoramic view or an image with an extended field of view covering more than one direction.

(99) It will be appreciated for example that the point of interest does not necessarily need to be a terrain point. The present method and the positioning device thereof provide the benefit that the point of interest may be located anywhere, i.e. may be any point in a scene or environment captured by the imaging device in the series of images. For example, the point of interest may be located on the ground, on a wall, above the operator or elevated such as a point located on the roof of a building.

(100) Although features and elements are described above in particular combinations, each feature or element can be used alone without the other features and elements or in various combinations with or without other features and elements. In the above, a processor or processing unit may include, by way of example, a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, and any other type of integrated circuit (IC).

(101) Further, although applications of the positioning device have been described with reference to surveying systems, the invention may be used in other applications and/or systems.

(102) Additionally, variations to the disclosed embodiments can be understood and effected by the skilled person in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word “comprising” does not exclude other elements, and the indefinite article “a” or “an” does not exclude a plurality. The mere fact that certain features are recited in mutually different dependent claims does not indicate that a combination of these features cannot be used to advantage.