A METHOD OF OPERATING A TIME OF FLIGHT CAMERA
20230095342 · 2023-03-30
CPC classification
G01S17/894
PHYSICS
G01S7/4915
PHYSICS
G01S17/32
PHYSICS
International classification
G01S17/894
PHYSICS
G01S7/4915
PHYSICS
H04N17/00
ELECTRICITY
Abstract
In one aspect the invention provides a method of operating a time of flight camera which includes the steps of capturing a sequence of time of flight camera data frames using a set of step frequency modulation signals to provide a time of flight camera data set, then completing a spectral analysis of the dataset which identifies frequency and phase value pairs indicative of the range of the camera to an object represented in the data frames. Next an estimated camera range value to an object represented in the data frames is determined using the frequency value, then a corrected camera range value is determined using the estimated camera range value and the phase value. A camera output is then provided which identifies the corrected range values of at least one object represented in the data frames of the dataset.
Claims
1. A time of flight camera which includes a signal generator configured to generate a source modulation signal and to modify the frequency of the source modulation signal by at least one multiple of an offset frequency, a camera light source configured to transmit light modulated by a modulation signal generated by the signal generator, a camera sensor configured to capture time of flight camera data frames from received reflected light, a processor configured to compile a data set from captured time of flight data frames and to complete a spectral analysis of the received dataset which identifies frequency and phase value pairs indicative of the range of the camera to an object represented in the data frames, and determine an estimated camera range value to an object represented in the data frames using the frequency value, and determine a corrected camera range value using the estimated camera range value and the phase value, and provide a camera output which identifies the corrected range values of objects represented in the data frames of the dataset.
2. The time of flight camera of claim 1 wherein the processor is configured to apply a calibration to the frames of the captured data set or during the capture of the data set so that the results of the spectral analysis yield a zero phase value when interpolated to a zero frequency value.
3. The time of flight camera of claim 2 wherein the calibration applied specifies a rotation to be applied to a phase value associated with each modulation frequency used to capture a data frame.
4. The time of flight camera of claim 2 wherein the calibration applied implements a windowing function.
5. The time of flight camera of claim 1 wherein the estimated camera range value is determined by the expression:
6. The time of flight camera of claim 1 wherein the estimated camera range value is determined by multiplying an index value associated with the frequency by the range resolution of the camera.
7. The time of flight camera of claim 1 wherein the corrected camera range value is determined by adding or subtracting from the estimated range value a correction variable defined by the expression:
8. The time of flight camera of claim 7 wherein the correction variable is added when the dataset is ordered with the lowest modulation frequency captured frame first, and the correction variable is subtracted when the data set is ordered with the highest modulation frequency captured frame first.
9. The time of flight camera of claim 1 wherein the captured data frames of the dataset are ordered prior to spectral analysis being completed to present the camera data frame captured using light modulated with the highest frequency modulation signal as the first data frame of the camera data set with each subsequent frame being captured using the next highest frequency modulation signal.
10. The time of flight camera of claim 1 wherein the data frames of the dataset are captured using light modulated with the highest frequency modulation signal as the first data frame of the camera data set with each subsequent frame being captured using the next highest frequency modulation signal.
11. The time of flight camera of claim 1 wherein the captured data frames of the dataset are ordered prior to spectral analysis being completed to present the camera data frame captured using light modulated with the lowest frequency modulation signal as the first data frame of the camera data set with each subsequent frame being captured using the next lowest frequency modulation signal.
12. The time of flight camera of claim 1 wherein the data frames of the dataset are captured using light modulated with the lowest frequency modulation signal as the first data frame of the camera data set with each subsequent frame being captured using the next lowest frequency modulation signal.
13. The time of flight camera of claim 1 which includes the additional step of validating a corrected camera range value against known harmonic artefacts and removing invalidated corrected range values from the camera output.
14. A set of computer executable instructions for a processor of a time of flight camera, said instructions executing the steps of: capturing a sequence of time of flight camera data frames using a set of step frequency modulation signals to provide a time of flight camera data set, completing a spectral analysis of the dataset which identifies frequency and phase value pairs indicative of the range of the camera to an object represented in the data frames, and determining an estimated camera range value to an object represented in the data frames using the frequency value, and determining a corrected camera range value using the estimated camera range value and the phase value, and providing a camera output which identifies the corrected range values of at least one object represented in the data frames of the dataset.
15. The set of computer executable instructions of claim 14 which includes the additional instruction step of applying a calibration to the frames of the captured data set so that the results of the spectral analysis yield a zero phase value when interpolated to a zero frequency value.
16. The set of computer executable instructions of claim 15 wherein the calibration applied specifies a rotation to be applied to a phase value associated with each modulation frequency used to capture a data frame.
17. A method of operating a time of flight camera which includes the steps of: capturing a sequence of time of flight camera data frames using a set of step frequency modulation signals to provide a time of flight camera data set, completing a spectral analysis of the data set which identifies frequency and phase value pairs indicative of the range of the camera to an object represented in the data frames, and determining an estimated camera range value to an object represented in the data frames using the frequency value, and determining a corrected camera range value using the estimated camera range value and the phase value, and providing a camera output which identifies the corrected range values of at least one object represented in the data frames of the data set.
18. The method of claim 17 which includes the additional step of applying a calibration to the frames of the captured data set so that the results of the spectral analysis yield a zero phase value when interpolated to a zero frequency value.
19. The method of claim 18 wherein the calibration applied specifies a rotation to be applied to a phase value associated with each modulation frequency used to capture a data frame.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0108] Additional and further aspects of the present invention will be apparent to the reader from the following description of embodiments, given by way of example only, with reference to the accompanying drawings in which:
[0117] Further aspects of the invention will become apparent from the following description of the invention which is given by way of example only of particular embodiments.
BEST MODES FOR CARRYING OUT THE INVENTION
[0120] At step C instructions are executed to modify the source modulation signal with the subtraction of a frequency offset value to provide a stepped modulation signal.
[0121] Step D is then executed with the light source transmitting light modulated by the modulation signal generated at step C with the light sensor capturing a further camera data frame.
[0122] At step E an assessment is made of the number of data frames captured so far when compared with the number of data frames required for a complete time of flight camera data set. If the data set is incomplete the process returns to step C where the frequency of the last modulation signal used is modified with the subtraction of the frequency offset value.
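The capture loop of steps C to E above can be sketched as follows. This is a minimal illustration only: the function and parameter names (`base_freq_hz`, `freq_offset_hz`, `n_frames`, `capture_frame`) are hypothetical and not taken from the specification, and the sensor is replaced by a stand-in callback.

```python
import numpy as np

# Sketch of the stepped-frequency capture loop (steps C-E), assuming
# hypothetical names; base_freq_hz, freq_offset_hz and n_frames are
# illustrative values, not figures from the specification.
def capture_data_set(base_freq_hz, freq_offset_hz, n_frames, capture_frame):
    """Capture one frame per modulation frequency, stepping the source
    modulation signal down by a fixed offset each iteration."""
    frames = []
    freq = base_freq_hz
    for _ in range(n_frames):
        frames.append(capture_frame(freq))  # steps B/D: transmit and sense
        freq -= freq_offset_hz              # step C: subtract the offset
    return np.stack(frames)

# Stand-in sensor model: each 2x2 "frame" simply records the frequency used.
data_set = capture_data_set(80e6, 1e6, 4, lambda f: np.full((2, 2), f))
```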
[0123] Once a complete data set has been captured step F is completed to perform a spectral transformation on the captured data frames. In the embodiment shown the spectral transformation is performed using a Fourier transform.
[0124] Lastly at step G the values present in the data set are analysed to identify particular frequency and phase values which are associated with objects reflecting light to the camera sensor. Frequency values correlating with the object’s range to the camera are identified, and the phase values associated with them are used to correct or refine the range value provided by the camera for an object.
[0125] In this embodiment an estimated range for an object is calculated by identifying intensity peaks in the results of the spectral analysis associated with particular frequency, phase paired values. For a particular intensity peak ω_est an estimated range is calculated from the expression:
[0126] Once this estimated range has been calculated a corrected range is then calculated at step G using the expression:
This corrected range value can then be provided as camera output to complete step G and terminate the operational method of this embodiment. In this embodiment an image is presented as a camera output, where the colour of individual pixels of this image indicate both position and corrected range values for an object in the field of view of the camera.
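The patent's own expressions for the estimated and corrected range are not reproduced in this text. The sketch below therefore illustrates one common stepped-frequency formulation consistent with the surrounding description (the coarse range as a peak bin index multiplied by the range resolution, refined by the phase at the peak); the model in which calibrated frame k carries phase 4πkΔf·d/c, and all numeric values, are assumptions for illustration only.

```python
import numpy as np

C = 3e8  # speed of light, m/s

def estimate_range(frames, df):
    """Return (coarse, corrected) range for one pixel's frame sequence.

    Assumes frame k carries phase 4*pi*k*df*d/c after calibration, so a
    Fourier transform over the frames peaks at bin d / resolution."""
    n = len(frames)
    resolution = C / (2 * n * df)              # camera range resolution
    spectrum = np.fft.fft(frames)
    peak = int(np.argmax(np.abs(spectrum[: n // 2])))
    d_est = peak * resolution                  # index value * resolution
    # For a rectangular window the peak phase equals (n-1)/2 times the
    # residual frequency error, which converts to a fractional-bin offset:
    phase = np.angle(spectrum[peak])
    d_corr = d_est + (n * phase / (np.pi * (n - 1))) * resolution
    return d_est, d_corr

# Simulated single target at 10 m with 64 frames stepped by 1 MHz.
n, df, d_true = 64, 1e6, 10.0
frames = np.exp(1j * 4 * np.pi * np.arange(n) * df * d_true / C)
d_est, d_corr = estimate_range(frames, df)
```

With these assumed values the range resolution is about 2.34 m, so the coarse estimate lands on the nearest bin while the phase term recovers the sub-bin remainder.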
[0128] Again the first step A of this operational method is executed to operate the signal oscillator to generate a source modulation signal. This modulation signal is generated with the use of a calibration which makes an adjustment to the phase of the signal so that the results of a spectral analysis yield a zero phase value when interpolated to a zero frequency value. In this embodiment the calibration applied specifies a rotation to be applied to a phase value associated with each modulation frequency used to capture a data frame.
[0129] In various additional embodiments this calibration can also be used to ensure that the phase of the modulation signal varies linearly with respect to the frequency of the modulation signal. Additional embodiments can also utilise this calibration to implement a windowing function in addition to adjustments to the phase of the modulation signal as referenced above.
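The calibration described above can be sketched as a per-frequency complex rotation, optionally folded together with a windowing function into a single set of per-frame weights. This is a minimal sketch under assumptions: the rotation angles would in practice come from a camera calibration, and the names and values here are illustrative only.

```python
import numpy as np

def calibrate(frames, rotations_rad, window=None):
    """Rotate each frame's phase by its calibration angle and optionally
    apply a windowing function in the same multiplication."""
    weights = np.exp(-1j * np.asarray(rotations_rad))
    if window is not None:
        weights = weights * window
    return frames * weights

# Hypothetical calibration: one rotation angle per modulation frequency,
# combined with a Hann window over the 8 captured frames.
n = 8
frames = np.ones(n, dtype=complex)
rotations = np.linspace(0.0, 0.7, n)
out = calibrate(frames, rotations, window=np.hanning(n))
```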
[0130] Step B is then executed with the light source transmitting light modulated by the source modulation signal and the light sensor capturing a camera data frame. In this embodiment a captured camera data frame is supplied as an input to a ‘first in last out’ or FILO buffer memory structure implemented by the camera processor.
[0131] At step C instructions are executed to modify the source modulation signal with the addition of a frequency offset value to provide a stepped modulation signal. Again the same calibration used with respect to step A is used to adjust the phase of the resulting stepped modulation signal.
[0132] Step D is then executed with the light source transmitting light modulated by the modulation signal generated at step C with the light sensor capturing a further camera data frame. Again this captured data frame is supplied as the next input to the above referenced FILO buffer.
[0133] At step E an assessment is made of the number of data frames captured so far when compared with the number of data frames required for a complete time of flight camera data set. If the data set is incomplete the process returns to step C where the frequency of the last modulation signal used is modified with the addition of the frequency offset value.
[0134] At step F an ordering process is completed to compile the full set of captured data frames into a simple data set. In this embodiment the contents of the FILO buffer are read out, thereby reordering the stored data frames in the sequence provided in accordance with the invention.
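The FILO buffer reordering of step F can be sketched with a plain list used as the buffer; the frame labels below are illustrative placeholders, not values from the specification.

```python
# Frames are captured lowest-frequency first (step C adds the offset each
# pass) and pushed into a first-in, last-out buffer.
buffer = []
for frame in ["f_low", "f_mid", "f_high"]:
    buffer.append(frame)

# Step F: reading the FILO buffer out reverses the capture order, so the
# highest modulation frequency frame becomes the first frame of the data set.
data_set = [buffer.pop() for _ in range(len(buffer))]
```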
[0135] Once the complete correctly ordered data set has been compiled step G is completed to perform a spectral transformation. In the embodiment shown the spectral transformation is performed using a Fourier transform.
[0136] Lastly at step H the values present in the data set are analysed to identify particular frequency and phase values which are associated with objects reflecting light to the camera sensor. Again frequency values correlating with the object’s range to the camera are identified, and the phase values associated with them are used to correct or refine the range value provided by the camera for an object. In such embodiments step H executes a similar process to that discussed with respect to step G of
[0137] Once this estimated range has been calculated a corrected range is then calculated at step H using the expression:
This corrected range value can then be provided as camera output to complete step H and terminate the operational method of this embodiment. In this embodiment camera output is provided to a machine vision system, where the format and content delivered is determined by the requirements of the receiving system.
[0139] Raw amplitude values are captured over time as modulation frequencies are increased and
[0140] Using the operational method described with respect to
[0142] As can be seen from
[0145] As can be seen from
[0150] Similarly to
[0151] In the preceding description and the following claims the word “comprise” or equivalent variations thereof is used in an inclusive sense to specify the presence of the stated feature or features. This term does not preclude the presence or addition of further features in various embodiments.
[0152] It is to be understood that the present invention is not limited to the embodiments described herein and further and additional embodiments within the spirit and scope of the invention will be apparent to the skilled reader from the examples illustrated with reference to the drawings. In particular, the invention may reside in any combination of features described herein, or may reside in alternative embodiments or combinations of these features with known equivalents to given features. Modifications and variations of the example embodiments of the invention discussed above will be apparent to those skilled in the art and may be made without departing from the scope of the invention as defined in the appended claims.