APPARATUS, METHOD AND SYSTEM FOR MEASURING LOCATIONS ON AN OBJECT
20230088410 · 2023-03-23
Assignee
Inventors
- Arnoud Marc Jongsma (Leidschendam, NL)
- Dennis van Weeren (Leidschendam, NL)
- Mario Josephus De Bijl (Leidschendam, NL)
CPC classification
G01C11/02
PHYSICS
G01B11/16
PHYSICS
International classification
Abstract
A system for monitoring survey reflectors arranged at a plurality of locations on an object, comprising: a camera, including: one or more light sources arranged to illuminate a field in space corresponding to at least 10% of a field of view of the camera, preferably the whole field of view; an image sensor receiving light beams comprising reflections of the emitted light by the survey reflectors and providing data; a body with an optical entry system, the image sensor located on a first side and the light sources on a second side of the body; and a processing unit processing the data.
The processing unit is configured to determine locations of the survey reflectors from the image sensor data and detect movement of the survey reflectors based on a comparison of the determined locations with previously determined locations.
Claims
1. A system for monitoring survey reflectors arranged at a plurality of locations on an object, the system comprising: a camera, comprising: one or more first light sources each for emitting a first divergent beam having a solid angle (Ω1) larger than zero, wherein said one or more first light sources are arranged such that a field in space corresponding to at least 10% of a field of view of the camera is illuminated by said one or more first light sources; an image sensor for receiving reflected light beams comprising reflections of said first divergent beam by said plurality of survey reflectors and for providing image sensor data; and a body provided with an optical entry system, said body having a first side facing an interior space of said camera and a second side facing away from said interior space, wherein said image sensor is located in said interior space and said one or more first light sources are located on said second side of said body, wherein said one or more first light sources are arranged at a first distance (D1; d) from said optical entry system; and a processing unit configured for processing said data; wherein said processing unit is configured to determine a location of each survey reflector from said data and to detect a movement of one or more of said plurality of survey reflectors based on a comparison of the determined location of each survey reflector with previously determined locations of the survey reflectors.
2. The system of claim 1, wherein said optical entry system comprises a non-refractive optical element forming the objective of the camera.
3. The system according to claim 1, wherein said field in space corresponds to at least 50% of the field of view of the camera, wherein preferably said field in space is substantially equal to or larger than the field of view of the camera.
4. The system according to claim 1, wherein said processing unit is further configured for applying a first code to said first divergent beam by modulation of said first divergent beam, and for applying filtering techniques during image processing of the recorded data.
5. The system according to claim 4, further comprising one or more second light sources emitting a second divergent beam, wherein said one or more second light sources are arranged at a second distance from said optical entry system which is larger than said first distance, wherein said second distance is such that reflections of the second divergent light beams from the survey reflectors do not enter the camera through the optical entry system, and wherein said processing unit is further configured to apply a second code to the second divergent beam, wherein the second code is different from the first code.
6. The system according to claim 4, further comprising one or more third light sources configured to emit a third divergent beam, wherein said one or more third light sources are arranged at a third distance from said optical entry system which is substantially similar to said first distance, and wherein the processing unit is further configured to apply a third code to said third divergent beam, wherein said third code is different from the first code, and, if used, is different from the second code.
7. The system according to claim 1, wherein the processing unit is further configured for applying a command code to said first divergent beam, the command code comprising instructions, information and/or requests to be sent to the survey reflectors.
8. The system according to claim 1, further comprising a survey reflector identification unit to be provided at or included in a survey reflector, the survey reflector identification unit comprising: a light receiver (71) for receiving the first divergent beam; a microcontroller coupled to said light receiver; and an identification unit light emitter, configured to emit a unique identification signal in response to said microcontroller receiving a command therefor.
9. A method for monitoring a plurality of survey reflectors provided at locations on an object, the method comprising: monitoring, at a first camera position, said locations by: emitting, by each one of one or more first light sources, a first divergent beam, having a solid angle larger than zero, towards said plurality of survey reflectors, wherein said plurality of survey reflectors are irradiated substantially simultaneously by said one or more first divergent beams, wherein said one or more first light sources are maintained in a substantially fixed position; recording, by an image sensor, data representing reflected light beams comprising reflections of said first divergent beam by said plurality of survey reflectors; and determining, by image processing of said data, a location of each survey reflector from said data and detecting a movement of one or more of said plurality of survey reflectors based on a comparison of the determined location of each survey reflector with previously determined locations of the survey reflectors.
10. The method according to claim 9, further comprising applying a first code to said first divergent beam by modulation of said first divergent beam, and applying a filtering technique when image processing said data, thereby distinguishing light originating from reflections of said first beam from other light sources and/or other reflections.
11. The method according to claim 10, further comprising: providing a body provided with an optical entry system allowing passage of light, said body having a first side and a second side, and arranging said body such that said first side faces an interior space of said camera and said second side faces away from said interior space, such that said image sensor is arranged in said interior space of said body and said one or more first light sources are arranged on said second side of said body; arranging said one or more first light sources at a first distance from said optical entry system; arranging one or more second light sources at a second distance which is larger than said first distance, wherein said second distance is such that reflections of the second divergent light beams from the survey reflectors do not enter the camera through the optical entry system, and emitting a second divergent beam by each of said one or more second light sources; and applying a second code to said second divergent beam by modulation of said second divergent beam, wherein said second code is different from said first code.
12. The method according to claim 11, wherein said optical entry system comprises a non-refractive optical element forming a camera objective.
13. The method according to claim 11, further comprising: generating a first image from reflected light having the first code; generating a second image from reflected light having the second code; and subtracting the second image from the first image, and determining the positions and/or movements of one or more of the survey reflectors from the resulting image.
14. The method according to claim 10, further comprising: providing one or more third light sources, each emitting a third divergent beam, wherein said one or more third light sources are arranged at a third distance from said optical entry system which is substantially similar to a first distance at which said one or more first light sources are arranged, and applying a third code to the third divergent beam by modulation of said third divergent beam, wherein said third code is different from the first code, and, if present, from said second code; and determining the distance between the camera and a survey reflector from a distance between a first point, p1, on the image sensor originating from a reflection of light emitted from the first light source from the survey reflector and a second point, p2, on the image sensor originating from a reflection of light emitted from the third light source from the survey reflector.
15. The method according to claim 9, the method further comprising: monitoring said locations at a second camera position which is positioned at a distance from said first camera position, a first viewing line between said first camera position and a reference survey reflector oriented at an angle with respect to a second viewing line between said second camera position and said reference survey reflector; and determining three dimensional coordinates of said survey reflectors based on said monitoring at said first camera position and said second camera position; wherein said monitoring at said second camera position is performed similar to said monitoring at said first camera position.
16. A system for monitoring survey reflectors arranged at a plurality of locations on an object, the system comprising: a camera configured for monitoring said survey reflectors, said camera comprising: one or more first light sources each for emitting a first divergent beam having a solid angle larger than zero; one or more second light sources each for emitting a second divergent beam having a solid angle larger than zero; an image sensor for receiving reflected light beams comprising reflections of said first divergent beam by said plurality of survey reflectors and for providing data; and a body provided with an optical entry system, said body having a first side facing an interior space of said camera and a second side facing away from said interior space, wherein said image sensor is located in said interior space and said first and second light sources are located on said second side of said body; and a processing unit configured for processing said data; wherein said one or more first light sources are arranged at a first distance from said optical entry system, and said one or more second light sources are arranged at a second distance from said optical entry system which is larger than said first distance, wherein the processing unit is configured to apply a first code to the first beam and a second code to the second beam, wherein the second code is different from the first code, wherein said first and second code are applied by modulation of said first and second divergent beams, and wherein said processing unit is configured for applying filtering techniques during image processing of the recorded data.
17. The system according to claim 16, wherein said one or more first light sources and said one or more second light sources are arranged such as to illuminate a field in space which corresponds to at least 10% of a field of view of the camera, such that said first and second beams are emitted towards a plurality of survey reflectors substantially simultaneously.
18. A method for monitoring locations on an object, the method comprising: providing a plurality of survey reflectors on said object, each survey reflector provided at one of said locations; monitoring said locations by: emitting, by each of one or more first light sources, a first divergent beam having a solid angle larger than zero towards said plurality of survey reflectors; emitting, by each of one or more second light sources, a second divergent beam having a solid angle larger than zero towards said plurality of survey reflectors; recording, by an image sensor, data representing reflected light beams comprising reflections of said first divergent beam by said plurality of survey reflectors; and determining, by image processing of said data, a location of each survey reflector from said data and detecting a movement of one or more of said plurality of survey reflectors based on a comparison of the determined location of each survey reflector with previously determined locations of the survey reflectors; wherein the method further comprises applying a first code to said first divergent beam and a second code to said second divergent beam, wherein said second code is different from said first code, wherein said first and second code are applied by modulation of said first and second divergent beams, and wherein said image processing comprises applying filtering techniques, such as to filter out data relating to reflections of said first divergent beam and/or said second divergent beam.
19. The method according to claim 18, wherein said image sensor is located in an interior space of said camera, at least partly defined by a body having a first side facing said interior space and a second side facing away from said interior space, said body comprising an optical entry system, wherein said first and second light sources are located on said second side of said body, and wherein said first light source is arranged at a first distance from said optical entry system, and said second light source is arranged at a second distance from said optical entry system which is larger than said first distance, wherein said first distance allows reflections of said first divergent beam from said survey reflectors to pass through said optical entry system, and wherein said second distance does not enable reflections of said second divergent beam from said survey reflectors to pass through said optical entry system.
20. The method according to claim 18, further comprising: generating a first image from reflected light having the first code; generating a second image from reflected light having the second code; and subtracting the second image from the first image, and determining the positions and/or movements of one or more of the survey reflectors from the resulting image.
21. Survey target unit for use with the system of claim 1.
22. Survey target unit according to claim 21, wherein the survey reflectors are arranged in an array having one of said survey reflectors arranged in a centre of said array, said centre representing a point of symmetry of said array.
23. Survey target unit according to claim 21, wherein the survey reflectors are arranged in a hexagonal array pattern.
24. Survey target unit according to claim 21, each survey reflector having a surface configured for receiving an incoming light beam, wherein said surface is substantially circular.
25. Survey target unit according to claim 21, wherein the plurality of survey reflectors are realized by a plurality of survey prisms, the prisms preferably being substantially identical, and wherein the surfaces of the prisms configured for receiving an incoming light beam are arranged in a single plane.
26. Survey target unit according to claim 21, wherein the plurality of survey reflectors are realized by a plurality of hollow mirrors each having a centre point, the hollow mirrors preferably being substantially identical, and wherein the centre points of all mirrors are arranged in a single plane.
27. Survey target unit according to claim 21, wherein the plurality of survey reflectors comprises 13 to 35 reflectors.
28. (canceled)
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0165] Embodiments of the present disclosure will be described herein below with reference to the accompanying drawings. However, the embodiments of the present disclosure are not limited to the specific embodiments and should be construed as including all modifications, changes, equivalent devices and methods, and/or alternative embodiments of the present disclosure.
[0166] The terms “have,” “may have,” “include,” and “may include” as used herein indicate the presence of corresponding features (for example, elements such as numerical values, functions, operations, or parts), and do not preclude the presence of additional features.
[0167] The terms “A or B,” “at least one of A or/and B,” or “one or more of A or/and B” as used herein include all possible combinations of items enumerated with them. For example, “A or B,” “at least one of A and B,” or “at least one of A or B” means (1) including at least one A, (2) including at least one B, or (3) including both at least one A and at least one B.
[0168] The terms such as “first” and “second” as used herein may modify various elements regardless of an order and/or importance of the corresponding elements, and do not limit the corresponding elements. These terms may be used for the purpose of distinguishing one element from another element. For example, a first element may be referred to as a second element without departing from the scope of the present invention, and similarly, a second element may be referred to as a first element.
[0169] It will be understood that, when an element (for example, a first element) is “(operatively or communicatively) coupled with/to” or “connected to” another element (for example, a second element), the element may be directly coupled with/to another element, and there may be an intervening element (for example, a third element) between the element and another element. To the contrary, it will be understood that, when an element (for example, a first element) is “directly coupled with/to” or “directly connected to” another element (for example, a second element), there is no intervening element (for example, a third element) between the element and another element.
[0170] The expression “configured to (or set to)” as used herein may be used interchangeably with “suitable for,” “having the capacity to,” “designed to,” “adapted to,” “made to,” or “capable of” according to a context. The term “configured to (set to)” does not necessarily mean “specifically designed to” at a hardware level. Instead, the expression “apparatus configured to . . . ” may mean that the apparatus is “capable of . . . ” along with other devices or parts in a certain context.
[0171] The terms used in describing the various embodiments of the present disclosure are for the purpose of describing particular embodiments and are not intended to limit the present disclosure. As used herein, the singular forms are intended to include the plural forms as well, unless the context clearly indicates otherwise. All of the terms used herein including technical or scientific terms have the same meanings as those generally understood by an ordinary skilled person in the related art unless they are defined otherwise. The terms defined in a generally used dictionary should be interpreted as having the same or similar meanings as the contextual meanings of the relevant technology and should not be interpreted as having ideal or exaggerated meanings unless they are clearly defined herein. According to circumstances, even the terms defined in this disclosure should not be interpreted as excluding the embodiments of the present disclosure.
[0172] For the purpose of determining the extent of protection conferred by the claims of this document, due account shall be taken of any element which is equivalent to an element specified in the claims.
[0173] The present invention will be discussed in more detail below, with reference to the attached drawings, in which:
[0174]-[0189] [The brief descriptions of the individual figures are not reproduced in this text.]
DESCRIPTION OF EMBODIMENTS
[0190] In general, the present invention relates to surveying objects or tracking of movement of objects by tracking one or more survey reflectors attached to the object. More specifically, the present invention is directed to apparatuses comprising cameras provided with one or more light sources configured for emitting a divergent light beam, for surveying and/or tracking of positions on objects.
[0191] Although the illustrated embodiments are described using a camera having an optical entry system, i.e., camera objective, formed by a non-refractive element in the form of a pinhole at the camera objective, it should be understood that the non-refractive element may alternatively be any of the non-refractive elements mentioned herein above. Alternatively, the optical entry system can be formed by a lens system, such as a lens system comprising a single lens. Analogously, although the embodiments are described using a prism as the survey reflector, it should be understood that a different reflective element, for example another type of prism or a hollow mirror, could also be used.
[0193] The object 3 is monitored by monitoring or measuring the positions of the survey reflectors 1. By monitoring their positions over time, movement of the whole or parts of the object 3 can be detected. Preferably also an amount, degree and/or direction of the movement can be determined. Thereby, the status, such as the stability or integrity, or the mechanical properties, of the object 3 can be monitored.
[0194] One camera 7 is shown. However, the system may comprise more than one camera 7.
[0195] According to the invention, the camera 7 is arranged to generate and transmit a diverging light beam 5, also referred to as first beam, to the plurality of survey reflectors 1. The survey reflectors 1 reflect the part of the diverging light beam 5 impinging thereon, thereby forming reflected beams 6 which are reflected back to the camera 7. As will be described in more detail further herein below, the light beam 5, generally substantially cone-shaped, has a solid angle, Ω1, covering the field of view of the camera 7. Thereby, the plurality of survey reflectors 1 can be monitored substantially simultaneously.
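As a rough aid to the cone-beam geometry described above (not part of the patent itself), the solid angle of a cone-shaped beam and the fraction of a conical field of view it illuminates can be sketched as follows. Python, the function names, and the assumption that both beam and field of view are circular cones are our own illustrative choices:

```python
import math

def cone_solid_angle(half_angle_deg):
    """Solid angle (steradians) of a cone with the given half-angle:
    Omega = 2*pi*(1 - cos(theta))."""
    return 2.0 * math.pi * (1.0 - math.cos(math.radians(half_angle_deg)))

def coverage_fraction(beam_half_angle_deg, fov_half_angle_deg):
    """Fraction of a conical field of view illuminated by a single
    conical beam, capped at full coverage."""
    return min(1.0, cone_solid_angle(beam_half_angle_deg)
                    / cone_solid_angle(fov_half_angle_deg))
```

With several first light sources, the individual coverage contributions would be combined, as the description notes, so that the assembly of beams covers the field of view.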
[0198] The survey reflectors 1 illustrated in
[0200] The system 20 comprises a camera 27 and a processing unit 29, which may be comprised in, or arranged within, the camera 27. Alternatively, it may be arranged remotely from the camera 27.
[0201] The camera 27 comprises a first light source 22 emitting a diverging beam 25, also referred to as the first divergent beam. The first light source 22 generally comprises a light emitting diode, LED. The first beam 25 has a first solid angle, Ω1, which is preferably large enough to cover substantially the entire field of view of the camera 27. Alternatively, as described above, a plurality of first light sources 22 may be provided in order to cover substantially the field of view of the camera. In such an embodiment, the solid angle of each beam does not necessarily cover the field of view, as long as the assembly of beams substantially covers the field of view. Thereby, all survey reflectors 21 located within the field of view of the camera, i.e., seen by the camera, are irradiated with the first beam 25 without moving, rotating or scanning the camera or light beam (with the possible exception of one or more survey reflectors being shadowed by an obstacle, such as a pedestrian or vehicle in the case of monitoring a building as shown in
[0202] For some embodiments, the first divergent beam 25, or the accumulation of first divergent beams emitted by a plurality of first light sources 22, only partially covers the field of view of the camera. For many surveying applications this may be sufficient. The partial coverage may, for example, be at least 10%, or, in some embodiments, at least 50%, depending on the application.
[0203] The first beam 25 is preferably amplitude modulated, thereby exhibiting a defined variation in time of its amplitude. Alternatively and/or additionally, other types of coding may be applied to the first beam, as described in more detail in the Summary section herein above. By applying appropriate filtering techniques during image processing of the data, as described above, environmental influences on the measurements, such as interference by ambient light, can be reduced.
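The description leaves the filtering technique open. One conventional way to recover an amplitude-modulated reflection against ambient light is per-pixel lock-in (synchronous) demodulation over a stack of frames; the sketch below assumes sinusoidal modulation and an integer number of modulation cycles per capture, and all names are illustrative rather than taken from the patent:

```python
import numpy as np

def lockin_demodulate(frames, mod_freq_hz, frame_rate_hz):
    """Per-pixel lock-in demodulation of a frame stack of shape (N, H, W):
    correlate each pixel's time series with quadrature references at the
    modulation frequency; unmodulated (ambient) light averages out."""
    n = frames.shape[0]
    t = np.arange(n) / frame_rate_hz
    ref_i = np.cos(2.0 * np.pi * mod_freq_hz * t)
    ref_q = np.sin(2.0 * np.pi * mod_freq_hz * t)
    i = np.tensordot(ref_i, frames, axes=(0, 0)) * 2.0 / n
    q = np.tensordot(ref_q, frames, axes=(0, 0)) * 2.0 / n
    return np.hypot(i, q)  # amplitude of the modulated component per pixel
```

Pixels illuminated by reflections of the coded first beam stand out in the returned amplitude map, while constant ambient light contributes nothing.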
[0204] The survey reflector 21 will reflect the part of the first beam 25 which it receives, forming a reflected beam 26 which is reflected back towards the camera 27.
[0205] The system 20 further comprises an image sensor 24, arranged for receiving reflected light, i.e. the part 261 of the reflected beam 26 which enters the camera 27. As a result of the reception of the reflected light 261, the image sensor 24 generates data, in preferred embodiments in the form of a two-dimensional image.
[0206] Between the image sensor 24 and the first light source 22, or at least the emitting surface thereof, a body 28 is arranged, which in the illustrated embodiment is substantially planar. The body 28 is non-transparent to light, and comprises an optical entry system, in the illustrated embodiment in the form of a non-refractive element such as pinhole 23, forming the objective of the camera. In the illustrated embodiment, the body 28 forms part of a housing of the camera.
[0207] Although the description herein will be focused on the optical entry system being formed by a pinhole, other types of non-refractive elements, in particular as described in WO 2019/143250 A1, may be equally well suitable, as may refractive elements, such as a single thin lens.
[0208] The processing unit 29 is configured to determine, generally by image processing of the data provided by the image sensor, a location of each survey reflector from the data and to detect a movement of one or more of the plurality of survey reflectors based on a comparison of the determined location of each survey reflector with previously determined locations thereof.
[0209] By simultaneously irradiating all survey reflectors located within the field of view of the camera with the divergent beam generated by the one or more first light sources, all survey reflectors can be measured substantially simultaneously, without the need to move the camera and/or scan the first beam. Thereby, the complexity of the system and the time required for measurement can be reduced. The survey reflectors are passive reflective elements, hence do not require any power source.
[0210] The first light source 22 is arranged at a first distance, D1, from the pinhole 23. To a first approximation, the first distance D1 is smaller than the diameter, D, of the surface area, or aperture, of the survey reflector 21 on which the first beam 25 is incident. If the light source were located outside this distance, the observer, i.e. the pinhole 23, would not be able to ‘see’ the reflection of the first light source, as follows from the principles of optics. The light source 22 can, to a good approximation, be assumed to be a point source. A perfect survey prism will reflect a diverging beam emitted by the first light source (if the beam width is wide enough to cover the entire aperture of the prism) back to the (point) source. The apparent distance of the reflected source to an observer located at the same position as the source is twice the distance from source to prism. The beam width, or solid angle (in degrees or steradians), of the reflected beam is limited by the aperture of the prism and the distance between the source and the prism. The diameter of the reflected beam spot at the location of the source is twice the diameter of the aperture of the prism (this is because the virtual source 22v is at twice the distance from source to prism). The centre of the spot of the virtual source 22v is located at the same position as the point source 22. Therefore, an observer (or light receiver) at the same distance from the prism as the light source will only be able to ‘see’ the reflected source if it is located within a radius of one prism diameter from the light source. In other words, reflections 251 of the first beam 25 in the prism 21 will only appear at the pinhole 23 if the first light source 22 is located at a distance from the pinhole of not more than the diameter of the prism.
This principle will be used in different embodiments for distinguishing and/or reducing the influences of ambient light and/or reflections of the first beam from other surfaces than the survey reflectors, as will be described in more detail further below.
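The visibility rule derived in the paragraph above reduces to two one-line helpers. This is a sketch of the stated geometric principle only; the function names and units are ours:

```python
def reflection_visible_at_pinhole(source_to_pinhole_m, prism_aperture_m):
    """A retroreflecting prism returns the beam toward the (point) source;
    the reflected spot, centred on the source, has twice the aperture
    diameter, so an observer sees the reflection only if it sits within
    one aperture diameter of the source."""
    return source_to_pinhole_m <= prism_aperture_m

def reflected_spot_diameter_m(prism_aperture_m):
    """Diameter of the reflected spot at the source plane: twice the prism
    aperture, because the virtual source sits at twice the range."""
    return 2.0 * prism_aperture_m
```

For example, with a 50 mm prism aperture, a light source 30 mm from the pinhole produces a visible reflection, while one 80 mm away does not.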
[0212] Elements or surfaces, other than the survey reflector 31, located in the field of view of the pinhole 33 and irradiated by the first beam 321, may also reflect light, either diffusely or as a mirror like surface. These reflections may interfere with the reflections from the prism 31 or even generate false targets. In order to obtain reliable measurement results, i.e., reliable and relevant monitoring of the object to be monitored, it is desired to be able to distinguish between such unwanted reflections and reflections of the first beam from the prisms.
[0213] This can be achieved by applying a first code to the first beam, for example by amplitude modulating the first light beam 351, emitted from the first light source 321, with a specified modulation frequency, and by applying a second code to the second beam emitted from the second light source 322. As described above, only reflections of the first light source 321, and not of the second light source 322, in the prism will reach the pinhole. Other elements in the field of view will reflect light of both sources back into the pinhole.
[0214] By giving the light emitted by first and the second light sources 321, 322, respectively, different codes, code 1 and code 2, respectively, signals originating from the first and second light sources, respectively, can be separated during processing of the data from the image sensor. This has been described in more detail in the summary section herein above.
[0215] By subtracting the two images obtained after filtering for the first and second code, respectively, the reflection caused by the prism 31 can be obtained. From the image resulting from the subtraction, the position and/or movement of the prism 31 can be determined.
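A minimal sketch of this subtraction step, assuming 8-bit images that have already been filtered for the first and second code respectively (the function and variable names are ours, not the patent's):

```python
import numpy as np

def isolate_prism_reflections(img_code1, img_code2):
    """Subtract the code-2 image from the code-1 image. Prisms return only
    the near-axis (code-1) source to the pinhole, while other surfaces
    reflect both sources, so clutter common to both images cancels and
    only the prism reflections remain."""
    diff = img_code1.astype(np.int32) - img_code2.astype(np.int32)
    return np.clip(diff, 0, None).astype(np.uint8)  # negative residue clipped
```

The bright spots remaining in the output can then be located, e.g. by centroiding, to determine prism positions.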
[0216] Alternatively, an optical bandpass filter might be provided at, i.e. in front of, the prism 31, allowing passage of light of one wavelength but not the other. The first light beam, having a first wavelength and provided with a first code, will be reflected by the prism, but not the second light beam, having a second wavelength and provided with the second code. Thereby, the prism will only reflect light from the first light source 321, whereas other elements in the field of view of the camera will reflect both light from the first light source 321 and from the second light source 322. This method will also work in case the second light source is located at a distance from the pinhole 33 smaller than the diameter of the prism.
[0217] In a further embodiment, illustrated in
[0218] In the embodiment illustrated in
[0219] As illustrated in
[0220] If reflected from a prism 51, as illustrated in
[0221] However, if reflected from a flat mirror 5101, as illustrated in
[0222] In the case of a hollow, i.e., concave, mirror surface 5103, as illustrated in
[0223] Further, from the distance between the two images, or targets, recorded by the image sensor 54 in the embodiment illustrated in
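Under the virtual-source geometry described earlier (the reflected source appears at twice the prism range), the camera-to-prism distance follows from the spot separation by similar triangles. The following one-liner is a sketch under those simplifying assumptions; the symbols focal (pinhole-to-sensor distance), baseline (separation of the first and third light sources) and the small-angle model are ours:

```python
def range_from_spot_separation(focal_m, baseline_m, spot_separation_m):
    """Estimate the camera-to-prism range R from the separation of the two
    sensor spots p1 and p2. Assuming each virtual source lies at distance
    2R behind the prism: separation = focal * baseline / (2R), hence
    R = focal * baseline / (2 * separation)."""
    return focal_m * baseline_m / (2.0 * spot_separation_m)
```

For example, with a 10 mm pinhole-to-sensor distance, a 100 mm source baseline, and a 50 μm spot separation, the estimated range is 10 m.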
[0224] Another measurement set-up, which enables determining the distance between the camera and the different survey reflectors, is illustrated in
[0225] Furthermore, in each of the embodiments described above, it may be desirable to identify each of the survey reflectors prior to starting a measurement or monitoring session, and/or while monitoring a set of survey reflectors over time, in order to distinguish reflections from the different survey reflectors from one another. This can be enabled by an embodiment as illustrated in
[0226] As illustrated in
[0227] The survey reflector 71 and the survey reflector identification unit 716 may be arranged in a survey reflector unit 718, which may also be referred to as a beacon.
[0228] The object 3 may be provided with a plurality of survey reflector units 718 at different locations thereof, for monitoring these locations. In the illustrated embodiment, the survey reflector 71 is provided by a prism. However, it can be understood that another type of reflector may be equally suitable, for example a hollow mirror, also referred to as a cat's eye.
[0229] By configuring the processing unit 79 to cause the camera 77, in particular the light source 72 thereof, to send out an identification request to all survey reflector units within its reach, each of these can be uniquely identified by an identification signal sent out by the identification unit 716 of the respective survey reflector units 718.
[0230] Now, the components according to embodiments of the camera 7, will be described in more detail, in particular with respect to their functional aspects. This description applies analogously to all cameras 27, 37, 47, 57, 67A, 67B, 77 described in the different embodiments herein above.
[0231]
[0232] All connections intended for transmission of data may be physical connections (wires); alternatively, however, they may be wireless and based on transmission of electromagnetic/light radiation.
[0233] The non-refractive optics 101 may be any of the types of non-refractive optical elements as described herein above. In preferred embodiments, the non-refractive optics may comprise one or more pinholes. The diameter of a pinhole may be in a range between 50 and 400 μm. Alternatively, as described above, the non-refractive optics may be replaced by a lens, which preferably is a thin lens allowing temperature modulation at low computational effort.
[0234] The processing unit 9 may be any suitable processing unit known from the art.
[0235] The image sensor 120 preferably comprises a set of light sensitive elements (pixels) arranged in a 2D matrix forming the camera's image plane, like a CCD sensor or a CMOS sensor. The image sensor 120 is arranged to receive the light beams 6 as entered through the non-refractive optics 101. Each light beam 6 will be focussed on a subset of these light sensitive elements. Each such subset corresponds to a solid angle of one incoming light beam 6, i.e., both an angle of incidence in a horizontal plane and an angle of incidence in a vertical plane relative to the earth. Angles of incidence can, of course, also be measured relative to another object than the earth, like a geostationary satellite. As long as both the camera 7 and the survey reflectors 1 remain at fixed positions, these subsets are static per survey reflector 1.
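For a pinhole objective, the mapping from the illuminated pixel subset to the two angles of incidence can be sketched as below. The pixel pitch and pinhole-to-sensor distance used here are illustrative assumptions, not values from the specification.

```python
import math

# Illustrative geometry (assumptions): 5 um pixel pitch, 20 mm distance
# between the pinhole and the image plane.
PIXEL_PITCH_M = 5e-6
PINHOLE_DISTANCE_M = 20e-3

def pixel_to_angles(px, py, cx, cy):
    """Return (horizontal, vertical) angles of incidence in degrees for a
    light spot centred at pixel (px, py), given the principal point (cx, cy),
    i.e. the pixel directly behind the pinhole."""
    ax = math.degrees(math.atan((px - cx) * PIXEL_PITCH_M / PINHOLE_DISTANCE_M))
    ay = math.degrees(math.atan((py - cy) * PIXEL_PITCH_M / PINHOLE_DISTANCE_M))
    return ax, ay

# A spot 1024 pixels to the side of the principal point.
angles = pixel_to_angles(2024, 1000, 1000, 1000)
```

As long as camera and reflector are static, the spot, and hence the computed angle pair, stays on the same pixel subset.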
[0236] In an alternative embodiment a line sensor can be used in combination with an optical slit as objective, rather than a pinhole. The optical slit, in such an embodiment, is oriented essentially perpendicular to the line sensor's lateral direction. Such alternative embodiments can provide measurements of angles in one dimension. In order to increase the number of dimensions available to be measured, two or more of such devices equipped with line sensors can be arranged in various different orientations. For example, two of such devices can be arranged in a perpendicular fashion, thereby allowing for measurements, similar to measurements performed with a 2D matrix sensor. Such linear sensor arrangements would have the advantage of consuming substantially less power than a device employing a 2D matrix sensor.
[0237] Optionally, a temperature control system 103 may be provided, in order to reduce thermal influences on the measurement data. The thermal capacity of the non-refractive optics 101 is relatively low when compared to a camera 7 using a lens system instead of the non-refractive optics 101. Thermal stability can be improved by implementing a temperature control system in the form of a thermostat 103.
[0238] In the below, some general aspects of the systems described herein above and methods of operation thereof will be summarized.
[0239] If the system is equipped with two or more cameras, e.g. as illustrated in
[0240] The image sensor 24, 34, 44, 54, 74, 120 converts the received light beams 6 into an image. The image is a set of electronic signals, here called pixel signals. Each pixel signal is generated by one light sensitive element and has a value depending on the intensity of light received by the light sensitive element. Thus, the pixel signals may also relate to the object 3 to which the survey reflectors 1 are attached and its surroundings.
[0241] The image sensor is positioned such that the light entering the camera through the non-refractive element forms a diffraction pattern on the image sensor. The diffraction pattern will depend on the properties of the non-refractive element, and will show up as dark or bright regions on the image sensor depending on the distance and angle of the respective pixels of the image sensor to the non-refractive element. By integrating a plurality of data frames, each comprising a number of pixels, typically at least 100, measurement results of high resolution can be achieved.
[0242] In embodiments using refractive optics, such as a lens, the image sensor is preferably positioned such that its light sensitive elements are in the vicinity of the focal plane of the lens. In another preferred embodiment, the image sensor 120 is positioned at a position within the focal distance of the lens such that the image is defocused to a certain degree, resulting in a beyond-infinity focus condition. In such an embodiment, the image processing may include super-resolution imaging based on defocusing techniques, thereby enabling sub-pixel resolutions. A resolution of 1/100 of a pixel, or even better, can then be obtained.
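The principle behind such sub-pixel localisation can be illustrated with an intensity-weighted centroid of a defocused spot, as sketched below. The spot values are synthetic; reaching the 1/100-pixel figure mentioned above would in practice require integrating many frames and a well-characterised spot shape.

```python
import numpy as np

# Synthetic defocused spot spread over several pixels (assumed values).
spot = np.array([
    [0.0, 1.0, 2.0, 1.0, 0.0],
    [1.0, 4.0, 8.0, 4.0, 1.0],
    [2.0, 8.0, 16.0, 8.0, 2.0],
    [1.0, 4.0, 8.0, 4.0, 1.0],
    [0.0, 1.0, 2.0, 1.0, 0.0],
])

# Intensity-weighted centroid: because the spot covers many pixels, its
# centre can be located with a resolution finer than one pixel.
ys, xs = np.mgrid[0:spot.shape[0], 0:spot.shape[1]]
total = spot.sum()
centroid_x = (xs * spot).sum() / total
centroid_y = (ys * spot).sum() / total
```

For this symmetric spot the centroid falls exactly on the central pixel; an asymmetric spot would yield a fractional, i.e. sub-pixel, position.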
[0243] The processing unit 9 is arranged to receive the pixel signals from the image sensor 120 and store them in memory 15. The pixel signals may be stored by processing unit 9 as a single picture, preferably with a time stamp and/or position stamp indicating the position of camera 7. However, preferably, the pixel signals are stored by processing unit 9 as a series of pictures together forming a video, in which each picture is provided with a time stamp and/or position stamp indicating the position of camera 7.
[0244] Clock 23 provides clock signals to processing unit 9, as known to a person skilled in the art. The clock signals are used for the normal processing of processing unit 9. Processing unit 9 may base the time stamp on these clock signals. However, camera 7 may also be equipped with a GNSS unit receiving time signals from a satellite or may receive time signals from another suitable source.
[0245] Memory 15 may comprise different types of sub-memories, like ROM (Read Only Memory)/Flash types of memory storing suitable program instructions and data to run the processing unit 9. Also, memory will comprise suitable RAM (Random Access Memory) types of memory for storing temporary data like the data received from image sensor 120. Memory 15 may also comprise cache type memory. Some or all of the sub-memories may be physically located remote from the other components. Processing unit 9 may also be arranged to send all pixel signals to a remote unit via electronic networking module(s) 20 for external storage and processing. A local copy of these pixel signals may then, but need not be, stored in a local memory 15 within camera 7.
[0246] Memory 15 stores initial position data indicating the initial position of camera 7. Such initial position data may have been established by using a theodolite and then be stored by a user. Such initial position data can also result from a measurement made by the camera 7 itself. E.g., the camera 7 can collect consecutive pictures from known “blinking” light sources installed on tall air traffic obstacle markers having well known locations. Such obstacle markers may be placed in defined vertical distances on tall structures and thereby allow for triangulation. Memory 15 also stores a camera ID identifying camera 7 and being used by processing unit 9 in external communications with other devices to identify itself to those other external devices.
[0247] Position and/or orientation measurement components 16 may include one or more accelerometers and/or gyrometers/gyroscopes, as is known to a person skilled in the art. They may also include the above-mentioned GNSS unit. Such accelerometers and/or gyrometers/gyroscopes measure the camera's own motion and derive an updated camera position and orientation from such measurements. The updated camera position and/or orientation is then stored by processing unit 9 in memory 15. By doing so, changing camera positions and/or orientations can be taken into account when measuring the position of the one or more survey reflectors 1. Accuracy may be on the order of a few thousandths of a degree; tests have shown 2 millidegrees peak-to-peak. Moreover, a three-axis accelerometer package can also measure the direction of earth gravity when static. A 3D gyro package of sufficient performance can measure the direction of the earth rotation axis (also when static).
[0248] Output unit 17 may comprise one or more sub-output-units, like a display and a speaker.
[0249] Input unit 19 may comprise one or more sub-input-units like a keyboard and a microphone. The display and keyboard may be made as two distinct touch screens. However, they may also be implemented as a single touch screen.
[0250] Electronic networking modules 20 may comprise one or more of LTE (Long Term Evolution), Ethernet, WiFi, Bluetooth, Powerline communication, Low Power Wide Area Network (e.g. Lora™ and Sigfox™), and NFC (Near Field Communication) modules. Technology known from the IoT (Internet of Things) may be used, as well as any proprietary communication protocol.
[0251] The at least one light source 102 comprises at least one light source, such as a Light Emitting Diode (LED) source, configured to generate light. Processing unit 9 is arranged to control each LED source such that it generates a light beam.
[0252] As shown in
[0253] The basic idea of the apparatus, and the method for monitoring an object using the apparatus, is that camera 7, or any one of the cameras 27, 37, 47, 57, 67A, 67B, 77 described in the different embodiments herein, is arranged on a fixed position such that it is static. Then, the static position is known and stored in memory 15 accessible by processing unit 9 in camera 7.
[0254] When all survey reflectors 1, or, equally, survey reflectors 21, 31, 41, 51, 61, 71 described in the various embodiments herein above, or the target units 1100, 1101, 1102, 1103 described herein further below and illustrated in
[0255] Thus, when the system starts, the camera knows all initial positions of survey reflectors which correspond to an initial position and orientation of object 3 to which the survey reflectors are attached.
[0256] The processing unit 9 is arranged to calculate an initial solid angle of incidence of each of the reflected light beams 6. I.e., received reflected light beams are imaged, via the non-refractive optics, on one or more light sensitive elements of image sensor 120. Processing unit 9 determines which light sensitive elements these are and then establishes the solid angle of incidence of the corresponding light pulse. Techniques to do so are known to persons skilled in the art and need no further detailed explanation here.
[0257] When the object 3 is stable, i.e., does not move, the positions of all survey reflectors 1 are also stable. Consequently, the solid angle of incidence of each reflected light beam on the camera's image sensor is fixed. However, as soon as the object 3 moves, or parts thereof, this solid angle of incidence of the reflected light beams 6 changes. The processing unit 9 is arranged to calculate this change of the solid angle per light beam 6.
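The movement-detection logic described above can be sketched as a simple threshold test on the change of the angles of incidence. The threshold value below echoes the few-millidegree accuracy mentioned earlier in this description, but is an assumption chosen for the example.

```python
import math

# Assumed detection threshold in degrees (illustrative, based on the
# few-millidegree accuracy figure mentioned in the description).
THRESHOLD_DEG = 0.002

def has_moved(initial, current, threshold=THRESHOLD_DEG):
    """Compare the current (horizontal, vertical) angles of incidence of a
    reflected beam, in degrees, with the stored initial angles, and report
    whether the angular change exceeds the threshold."""
    dh = current[0] - initial[0]
    dv = current[1] - initial[1]
    return math.hypot(dh, dv) > threshold

# A reflector whose beam angle is unchanged within measurement noise...
stable = has_moved((12.500, -3.250), (12.5005, -3.2503))
# ...versus one whose beam angle has clearly shifted.
moved = has_moved((12.500, -3.250), (12.510, -3.250))
```

In a full system this test would be applied per light beam 6, with the initial angles taken from memory 15.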
[0258]
[0259] The camera 7 receives the reflected light beams 6 from a survey reflector 1, 1100, 1101, 1102, 1103, 1401, 1402, 1403, which are projected onto the image sensor 120.
[0260] The first step, 1001, in the processing is to record, or capture, at least two, but preferably many, image frames, or raw data frames, in a sequential order. Each image frame is essentially a 2D array of light values. By capturing a sequence of image frames, a 3D matrix of light values is formed. The axes in the 3D matrix are X, Y and time T. In one embodiment a sequence of 100 images is captured with an interval of 1/60 s.
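Step 1001 can be sketched as stacking sequentially captured frames into such a 3D matrix. The frame dimensions and the synthetic sensor read-out below are placeholders; only the frame count follows the 100-frame example in the text.

```python
import numpy as np

# Illustrative sizes: 100 frames (as in the text), tiny 4x6 frames (assumed).
N_FRAMES, HEIGHT, WIDTH = 100, 4, 6

def capture_frame(t):
    # Placeholder for a real sensor read-out at time step t; here it just
    # returns a synthetic frame that alternates between dark and bright.
    return np.full((HEIGHT, WIDTH), float(t % 2))

# Stack the sequence into a 3D matrix with axes (T, Y, X).
frames = np.stack([capture_frame(t) for t in range(N_FRAMES)], axis=0)
```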
[0261] In steps 1002, digital processing is applied to the sequence of image frames, in order to enhance the data relating to one specific light source, or coded light beam, while suppressing influences of other light received by the image sensor, as described herein above. The output of this process is a 2D image. Multiple light sources with different and unique codes can be processed using the same 3D matrix of light values over time, and each will produce a unique 2D image.
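One way such per-source enhancement could work is to correlate each pixel's time series with the on/off code of one light source; constant ambient light then correlates to zero, and only pixels lit by that source stand out. The binary codes and frame data below are synthetic assumptions for illustration, not the patent's actual coding scheme.

```python
import numpy as np

# Assumed unique on/off codes for two light sources.
code_a = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=float)
code_b = np.array([0, 1, 1, 0, 1, 0, 0, 1], dtype=float)

# Synthetic 3D matrix of light values (T, Y, X): ambient light everywhere,
# plus one pixel modulated by each source.
T, H, W = len(code_a), 3, 3
frames = np.full((T, H, W), 5.0)
frames[:, 1, 1] += 100.0 * code_a    # pixel lit by source A
frames[:, 2, 0] += 100.0 * code_b    # pixel lit by source B

# Zero-mean the code so constant (ambient) light correlates to zero, then
# correlate along the time axis to obtain the 2D image for source A.
kernel_a = code_a - code_a.mean()
image_a = np.tensordot(kernel_a, frames, axes=(0, 0))
```

In `image_a`, the pixel lit by source A is strongly positive, ambient-only pixels are near zero, and the pixel lit by the differently coded source B does not produce a comparable positive response.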
[0262] In step 1002a, the processed 2D image may comprise both the (diffusely) reflected light of the environment and also the reflections from the survey reflectors of the divergent light beam emitted by the one or more first light sources.
[0263] In step 1002b, the processed 2D image relating to the light emitted by the second light sources may contain (diffusely) reflected light of the environment but not from the survey reflectors, as described with reference to
[0264] In step 1002c, which may be optional, the processed 2D image may equally contain both the (diffusely) reflected light of the environment and also the survey reflectors, but since it comes from a light source, e.g. the one or more third light sources, with a different position than the one in 1002a, the position of the survey reflectors in the 2D image will be slightly shifted compared to the image obtained in step 1002a.
[0265] In step 1003, the two 2D images, obtained in steps 1002a and 1002b, respectively, are subtracted from one another, in order to only reveal the light of one particular light source reflected by the survey reflectors. By subtracting the image obtained in step 1002b, and, optionally, the image obtained in step 1002c, from the image obtained in step 1002a, the reflections of the light emitted by the first light source from the survey reflectors can be enhanced, while suppressing influences from ambient light and from diffusely reflected light of the other light sources. The result is a single 2D image representing substantially only reflections of the first divergent light beam in the survey reflectors, while other influences have been suppressed.
[0266] In step 1004, the positions of the survey reflectors, and/or movement of one or more of the monitored survey reflectors, are determined. Light emitted by the specific light source, for example the first light source, or light having a specific coding, for example the first coding, reflected from the survey reflectors generates, for each survey reflector, a feature, or blob, preferably comprising a plurality of pixels, and, in the case of non-refractive optics, a diffraction pattern. By correlating the features or patterns in the 2D image obtained at step 1003 with predetermined (e.g. default) features or diffraction patterns and/or with previously measured features or diffraction patterns, the positions of the survey reflectors, and/or movement of one or more thereof, can be determined.
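The correlation of step 1004 can be sketched as a brute-force template match: slide a predetermined feature pattern over the decoded 2D image and take the position with the highest correlation score. The 3×3 template below is an illustrative stand-in for a measured or default diffraction pattern.

```python
import numpy as np

# Assumed predetermined feature pattern (stand-in for a diffraction pattern).
template = np.array([
    [1.0, 2.0, 1.0],
    [2.0, 4.0, 2.0],
    [1.0, 2.0, 1.0],
])

# Decoded 2D image containing one reflector blob, placed so that its
# top-left corner is at row 3, column 2.
image = np.zeros((7, 7))
image[3:6, 2:5] = template

# Slide the template over the image and keep the best-correlating position.
best_score, best_pos = -np.inf, None
for r in range(image.shape[0] - 2):
    for c in range(image.shape[1] - 2):
        score = float((image[r:r+3, c:c+3] * template).sum())
        if score > best_score:
            best_score, best_pos = score, (r + 1, c + 1)  # blob centre
```

Combined with the sub-pixel centroiding described above, `best_pos` would be refined to a fractional pixel position before being compared with previously measured positions.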
[0267] In step 1005, the obtained data, i.e., the precise 2D coordinates of each of the reflectors, are made available to other processes. These may include a further process to correct for errors such as camera motion, temperature compensation etc., a process to estimate distance based on two light sources, storage of data, displaying of data, etc.
[0268]
[0269] The front face of the survey reflectors, receiving the incoming light, may be circular, whereby the hexagonal arrangement provides the most efficient arrangement of the reflectors, as illustrated in
[0270] A plurality of target units 1100 illustrated in
[0271]
[0272]
[0273]
[0274]
[0275]
[0276] In the embodiments illustrated in
[0277]
[0278] It will be clear to a person skilled in the art that the scope of the invention is not limited to the examples discussed in the foregoing, but that several amendments and modifications thereof are possible without deviating from the scope of the invention as defined in the attached claims. While the invention has been illustrated and described in detail in the figures and the description, such illustration and description are to be considered illustrative or exemplary only, and not restrictive. The present invention is not limited to the disclosed embodiments but comprises any combination of the disclosed embodiments that can provide an advantage.
[0279] Variations to the disclosed embodiments can be understood and effected by a person skilled in the art in practicing the claimed invention, from a study of the figures, the description and the attached claims. Features of the above described embodiments and aspects can be combined unless their combining results in evident technical conflicts.