A TIME-OF-FLIGHT SENSOR SYSTEM
20230358891 · 2023-11-09
Inventors
CPC classification
F03G7/0636
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F03G7/06143
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
F03G7/0665
MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
G01S17/42
PHYSICS
G01S17/894
PHYSICS
International classification
G01S17/894
PHYSICS
Abstract
A time-of-flight sensor system 10 comprising: an illumination source 11 for illuminating a subject 19 to which a time-of-flight is to be measured; an optical system configured to, using at least one actuator, transition the illumination source 11 between providing spot illumination and flood illumination; and a sensor 12 comprising a sensor surface. The sensor surface is configured to sense light scattered by the subject 19 from the illumination source 11 and to provide data dependent on sensed light. The spot illumination has a spatially non-uniform intensity over the sensor surface, and the optical system is configured to move the spot illumination across at least part of the sensor surface to generate an output frame, wherein the at least one actuator comprises at least one shape memory alloy (SMA) component.
Claims
1. A time-of-flight sensor system comprising: an illumination source for illuminating a subject to which a time-of-flight is to be measured; an optical system configured to transition the illumination source between providing spot illumination and flood illumination; and a sensor comprising a sensor surface, the sensor being configured to sense light scattered by the subject from the illumination source and to provide data dependent on sensed light, wherein the spot illumination has a spatially non-uniform intensity over the sensor surface, and the optical system is configured to, using at least one actuator, move the spot illumination across at least part of the sensor surface to generate an output frame, wherein the at least one actuator comprises at least one shape memory alloy (SMA) component.
2. The time-of-flight sensor system according to claim 1, wherein the optical system is configured to move the spot illumination in a scanning pattern across at least part of the sensor surface.
3. The time-of-flight sensor system according to claim 2, wherein the scanning pattern comprises moving the spot illumination along a first direction across at least part of the sensor surface, wherein the scanning pattern further comprises moving the spot illumination along a second direction across at least part of the sensor surface, and wherein the first direction is perpendicular to the second direction or angled to the second direction in a plane.
4-7. (canceled)
8. The time-of-flight sensor system according to claim 2, wherein the optical system is configured to focus and defocus the illumination source using the at least one actuator.
9. The time-of-flight sensor system according to claim 8, wherein the optical system is configured to focus and defocus the illumination source by moving, relative to a support structure, a moveable element between a respective first position and second position, wherein the at least one SMA component is connected between the moveable element and the support structure, the SMA component being configured to, on contraction, move the moveable element from the first position to the second position, and wherein the at least one actuator comprises a second SMA component connected between the moveable element and the support structure, the second SMA component configured to, on contraction, move the moveable element from the second position to the first position.
10-15. (canceled)
16. The time-of-flight sensor system according to claim 9, wherein the moveable element is configured to move along a movement axis and is prevented from moving beyond the first position or the second position by support portions of the support structure, and wherein the moveable element is orientated at a non-zero angle to the orientation of the support portions such that when the moveable element moves towards the first position or the second position, a first portion of the moveable element contacts the support portion before a second portion of the moveable element, causing the moveable element to be tilted about a rotation axis.
17-21. (canceled)
22. The time-of-flight sensor system according to claim 9, wherein the moveable element is a lens component.
23-31. (canceled)
32. The time-of-flight sensor system according to claim 1, wherein the optical system is configured to determine, for the illumination source, which of spot illumination or flood illumination to provide, based on one or more of a desired range of illumination, a desired resolution of the generated image, or a desired frame rate, and/or wherein the optical system determines which of spot illumination or flood illumination to provide based on a mode of operation of the time-of-flight sensor system.
33-35. (canceled)
36. The time-of-flight sensor system according to claim 1, wherein the spot illumination comprises one or more spots, and a ratio of intensity of illumination at the one or more spots to intensity of illumination between the one or more spots is 20 or greater, such as 30 or greater, and/or wherein the flood illumination comprises a ratio of intensity of illumination at spots of the flood illumination to intensity of illumination between the spots of the flood illumination of 2 or less, such as 1.5 or less.
37. (canceled)
38. The time-of-flight sensor system according to claim 1, wherein the optical system is configured to move the spot illumination in a continuous manner during which the sensor senses the scattered light, or wherein the optical system is configured to pause movement of the spot illumination at plural predetermined positions along a path of movement so as to allow the sensor to sense the scattered light at the predetermined positions, and to resume movement along the path of movement once the sensor has finished sensing the scattered light.
39-46. (canceled)
47. A method of sensing light scattered from a subject for a time-of-flight sensor system, the method comprising: determining, for an illumination source, which of flood illumination or spot illumination to provide; illuminating, by the illumination source, the subject with the determined flood illumination or spot illumination; and sensing, by a sensor, light scattered by the subject from the illumination source and providing data dependent on the sensed light, the sensor comprising a sensor surface, wherein the spot illumination has a spatially non-uniform intensity over the sensor surface, and when the illumination source provides spot illumination, moving the spot illumination across at least part of the sensor surface to generate an output frame.
48. The method according to claim 47, further comprising: pausing movement of the spot illumination at plural predetermined positions along a path of movement; sensing the scattered light at the predetermined positions, and resuming movement along the path of movement once the sensor has finished sensing the scattered light.
49-58. (canceled)
59. A time-of-flight sensor system comprising: an illumination source configured to illuminate a subject to which a time-of-flight is to be measured, wherein the illumination is spot illumination comprising one or more spots of light; a sensor comprising a sensor surface having one or more pixels, the sensor being configured to: sense one or more spots of light, scattered by the subject from the illumination source, that each illuminates a respective first area of the sensor surface, wherein each pixel of the one or more pixels has an area larger than the first area of the sensor surface illuminated by a sensed spot of light, and provide a data set dependent on the one or more sensed spots of light, the provided data set being for generation of an output frame reflecting each area of the sensor surface illuminated by the one or more sensed spots of light; and an optical system configured, relative to the illumination source and using at least one actuator, to move the spot illumination relative to the subject such that each of the one or more sensed spots of light moves to illuminate a respective second area of the sensor surface, the distance moved by each of the one or more spots of light being less than a distance spanned by a pixel of the one or more pixels, wherein the at least one actuator comprises at least one shape memory alloy (SMA) component.
60. The time-of-flight sensor system according to claim 59, wherein the time-of-flight sensor system is configured to use the data set to generate an output frame reflecting each first area of the sensor surface illuminated by the one or more sensed spots of light.
61. The time-of-flight sensor system according to claim 60, wherein the sensor is configured to, each time the optical system engages the illumination source to move the spot illumination relative to the subject, provide a second data set dependent on the moved one or more sensed spots of light, wherein the time-of-flight sensor system is configured to generate a second output frame using each second data set, each second output frame reflecting the second areas of the sensor surface illuminated by the respective moved one or more sensed spots of light, wherein the time-of-flight sensor system is configured to combine two or more output frames to produce a final output frame, and wherein the final output frame has a resolution equal to the sum of the resolutions of the or each combined output frame.
62-64. (canceled)
65. The time-of-flight sensor system according to claim 59, wherein each respective area of the sensor surface illuminated by a sensed spot of light is equal to the respective first area of the sensor surface and is within the bounds of a corresponding pixel of the one or more pixels, wherein the optical system is configured to engage the illumination source to move the spot illumination relative to the subject such that each respective second area illuminated by a moved spot of light remains within the bounds of the corresponding pixel, and wherein corresponding first and second areas do not overlap.
66-70. (canceled)
71. The time-of-flight sensor system according to claim 59, wherein the one or more pixels comprises an array of pixels and the distance spanned by a pixel comprises the pixel pitch for the pixels of the array, wherein the one or more sensed spots of light comprises a spot pattern made up of a grid of spots of light that corresponds to the array of pixels such that each sensed spot of light illuminates an area of the sensor surface that is within the bounds of a respective pixel in the array.
72. The time-of-flight sensor system according to claim 59, wherein the optical system is configured to move the spot illumination in a scanning pattern across at least part of the sensor surface.
73. The time-of-flight sensor system according to claim 72, wherein the scanning pattern comprises moving the spot illumination along a first direction across at least part of the sensor surface, wherein the scanning pattern further comprises moving the spot illumination along a second direction across at least part of the sensor surface, and wherein the first direction is perpendicular to the second direction or angled to the second direction in a plane.
74-75. (canceled)
76. The time-of-flight sensor system according to claim 59, wherein the optical system comprises a diffraction element for providing one or more spots of light, wherein the optical system comprises a lens for focusing the one or more spots of light onto the subject, and wherein the optical system is configured to move the spot illumination relative to the subject by the actuator engaging the lens to move the lens relative to the illumination source.
77-91. (canceled)
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0072] Certain embodiments of the presently-claimed invention will now be described, by way of example only, with reference to the accompanying drawings.
[0090] Like features are denoted by like reference numerals.
DETAILED DESCRIPTION
[0091] An example time-of-flight sensor system will now be described with reference to the accompanying drawings.
[0093] The system 10 also comprises a sensor 12 comprising a sensor surface. The sensor 12 is configured to sense light scattered by the subject 19 from the VCSEL array 11 and to provide data dependent on the sensed light.
[0094] The example time-of-flight system 10 is illustrated in the accompanying drawings.
[0095] Light 17 is emitted from the VCSEL array 11 and is focused by the lens element 15 onto the subject 19. The spot illumination provided by the VCSEL array 11 has a spatially non-uniform intensity over the sensor surface.
[0097] The lens element 15 may comprise one or more lenses whose optical axes lie parallel to the z-direction.
[0098] Additionally, the spot illumination may scan across the subject 19 by shifting the lens element 15 in the x-y plane perpendicular to the z-axis. The use of eight SMA actuator wires 2 for permitting this lateral movement is described in international patent publication numbers WO2011/104518, WO2012/066285, WO2014/076463, and WO2017/098249.
[0099] The control system may set target SMA wire resistance values for all SMA wires 2 that correspond to the desired position of the lens element. Closed-loop feedback control may be performed by comparing the target SMA wire resistance values with real-time SMA wire resistance measurements to set the electrical drive power through the SMA wires 2 in real time. The target SMA wire resistance values set by the control system can be extracted from a look-up table of pre-determined calibrated values stored in the memory of the control system. These pre-determined values can be determined during manufacture, during a start-up procedure performed after every initialisation of the SMA actuator, or a combination of both.
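The closed-loop resistance-feedback scheme described above can be sketched as follows. All names and calibration values are hypothetical; SMA wire resistance falls as the wire contracts, so it serves as a position feedback signal.

```python
# Sketch of closed-loop SMA wire control using resistance feedback.
# Calibration values below are hypothetical placeholders.

CALIBRATION_LUT = {
    # lens position (um) -> target wire resistance (ohm), from calibration
    0.0: 21.0,
    50.0: 20.4,
    100.0: 19.8,
}

def target_resistance(position_um: float) -> float:
    """Linearly interpolate a target resistance from the calibration look-up table."""
    points = sorted(CALIBRATION_LUT.items())
    if position_um <= points[0][0]:
        return points[0][1]
    for (x0, r0), (x1, r1) in zip(points, points[1:]):
        if position_um <= x1:
            t = (position_um - x0) / (x1 - x0)
            return r0 + t * (r1 - r0)
    return points[-1][1]

def drive_power_step(measured_ohm, target_ohm, gain=0.5, p_min=0.0, p_max=1.0):
    """One feedback iteration: resistance falls as the wire contracts, so drive
    power is increased while measured resistance is above the target."""
    error = measured_ohm - target_ohm
    power = gain * error
    return min(max(power, p_min), p_max)
```

In practice one such loop would run per wire, with the real-time resistance measurement taken from the drive electronics.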
[0100] The spot illumination used in this example is illustrated in the accompanying drawings.
[0101] The optical system is configured to move the spot illumination across at least part of the sensor surface to generate an output frame. The movement of the spot illumination is caused by at least one actuator, which in this example is a single actuator (not shown). The actuator moves the spot illumination in a scanning pattern across at least part of the sensor surface. In this example, the scanning pattern 20 comprises moving the spot illumination in a first direction, a second direction, and a third direction, where the first and third directions are parallel and are perpendicular to the second direction. The scanning pattern 20 optionally comprises a fourth direction to return the spot illumination to its starting position. Movement in the three (or four) directions makes up a scanning cycle of the scanning pattern.
[0102] The set of regions 18 are arranged such that the movement of the spot illumination causes the set of regions 18 to cover more than 90% of the sensor surface during a cycle of the scanning pattern. The set of regions 18 are also arranged such that the movement of the spot illumination substantially avoids regions of the set of regions 18 covering the same part of the sensor surface more than once during a cycle of the scanning pattern.
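The scanning cycle of paragraph [0101] can be expressed as a sequence of lateral offsets of the spot illumination. This is a minimal sketch assuming a square cycle with a hypothetical step size; the actual step and geometry depend on the actuator stroke.

```python
def scan_cycle(step: float):
    """Positions visited during one cycle of the scanning pattern: a first
    direction, a perpendicular second direction, a third direction parallel
    to the first, and an optional fourth (return) direction. `step` is a
    hypothetical lateral step size in arbitrary units."""
    origin = (0.0, 0.0)
    after_first = (step, 0.0)    # move along the first direction
    after_second = (step, step)  # second direction, perpendicular to the first
    after_third = (0.0, step)    # third direction, parallel to the first
    return [origin, after_first, after_second, after_third, origin]  # fourth leg returns to start
```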
[0103] The same time-of-flight sensor system 10 is capable of adapting to different applications. While in this example the system 10 provides spot illumination, including dot-scan sensing (moving the spot illumination across at least part of the sensor surface), the system 10 is equally capable of providing flood illumination by moving the lens component to the second position 14.
[0105] In one embodiment according to the present invention, the spot illumination moves sequentially across each of the regions 50a-50i in a continuous motion.
[0106] During the continuous movement of the spot illumination, the sensor 12 continuously senses scattered light, and therefore the sensor 12 may sense scattered light when the spot illumination is moving between the regions 50a-50i, e.g. at positions where the spot illumination may not be perfectly aligned with the regions 50a-50i. Such techniques may therefore have a drawback in that the sensor 12 may sense a substantial amount of scattered light originating from noise sources other than the VCSEL array 11 (in some cases the scattered light may originate solely from noise sources) for a significant portion of the sensing or exposure time, e.g. up to half of the exposure time, which may negatively impact the accuracy of the time-of-flight measurement.
[0107] Alternatively, in another embodiment according to the present invention, the movement of the spot illumination pauses at each of the regions 50a-50i along a path of movement, so as to allow the sensor 12 to sense the scattered light at each of the regions 50a-50i.
[0108] Since the scattered light is sensed when the spot illumination is substantially aligned with the target regions, the signal-to-noise ratio in the sensed scattered light is significantly improved in comparison to the previous embodiment. The spot illumination resumes movement once the sensor has finished sensing the scattered light. That is, the spot illumination moves sequentially across each of the regions 50a-50i, whilst pausing at each of the regions 50a-50i.
[0109] In some embodiments, the sensor 12 is configured to sense scattered light only when the spot illumination has moved to, and is held stationary at, the regions 50a-50i. Thus, the sensor 12 and the actuation apparatus 70 are synchronised so that the sensor 12 operates only when the movement of the spot illumination ceases at each of the regions 50a-50i.
[0110] In other embodiments, the sensor 12 is configured to sense scattered light in a continuous manner, and only the scattered light that is sensed when the spot illumination has moved to, and is held stationary at, the regions 50a-50i is used for generating an output frame. For example, the sensor 12 may sense scattered light even when the spot illumination is not aligned. However, such data may be discarded and not used for generating an output frame. Such an arrangement may reduce the degree of synchronisation required between the sensor 12 and the actuation apparatus 70.
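The arrangement of paragraph [0110], in which continuously sensed data is kept only for the stationary periods, amounts to a simple gating step. The sketch below assumes a hypothetical sample format pairing each reading with a stationary flag supplied by the actuation apparatus.

```python
def gate_samples(samples):
    """Keep only sensor samples captured while the spot illumination is held
    stationary at a target region; samples captured mid-movement are discarded
    rather than used for the output frame. Each sample is a hypothetical
    (stationary: bool, value) pair."""
    return [value for stationary, value in samples if stationary]
```

Gating the data after readout, as here, relaxes the synchronisation otherwise needed to start and stop the sensor exactly when movement ceases.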
[0114] The lens component 15 is orientated at a non-zero angle to the orientation of the endstops 21, 23, such that when the lens component 15 moves between the second position and the first position a first portion 27 of the lens component 15 contacts the endstop 21 before a second portion of the lens component 15, causing the lens component 15 to be tilted about a rotation axis.
[0117] The endstops 21, 23 prevent the lens component 15 from moving beyond the first position and the second position. That is, they provide limits on the movement of the lens component 15. On contraction of the SMA wire, the lens component 15 is moved into position but cannot move beyond the limits of the endstops 21, 23. Thus, the SMA wire may be configured to contract quickly, thereby moving the lens component quickly. For example, the lens component 15 may be moved between the second position and the first position in less than 3 ms.
[0119] The actuation apparatus 22 also comprises another SMA wire 33. The SMA wire 33 is also in connection with the lens component 15, and is also in connection with the endstop 23 (not shown in the top view). The SMA wire 33 is configured to, on contraction, move the lens component 15 from the first position to the second position. That is, the SMA wire 33 is configured to defocus the illumination source such that the illumination source provides flood illumination. The movement of the lens component 15 from the first position to the second position works in a similar and corresponding manner to the movement of the lens component from the second position to the first position.
[0123] An example time-of-flight sensor system 100 and a method of sensing light scattered from a subject for a time-of-flight sensor system will now be described.
[0124] The time-of-flight sensor system 100 has an illumination source 102; an optical system 106 and a sensor 108. The illumination source 102 is for illuminating the subject 104 to which a time-of-flight is to be measured. In this example, the illumination source 102 is a dot projector formed by a vertical-cavity surface-emitting laser (VCSEL) array, but other appropriate illumination sources may be used. The optical system is located between the illumination source and the subject.
[0125] The optical system 106 comprises an optical assembly or element with a lens 110, here a collimating lens, that focusses the illumination source 102, an actuator 122 and a diffraction element in the form of a diffraction grating 126. The lens 110 may comprise a plurality of lenses 110 provided along an optical axis, or it may be a microlens array comprising plural microlenses positioned substantially across the same plane along the optical axis.
[0126] The lens 110 is movable with respect to the subject 104. For example, the lens can be moved laterally—“side to side”—with respect to the subject along an axis indicated by arrows 112. The lens may also be moved along the other two axes of the three dimensional scene—towards/away from the subject, and vertically or “up and down”—relative to the subject, depending on the use case and the range of motion needed, e.g. switching between spot illumination and flood illumination or adjusting the spot size in spot illumination. The lateral and axial movements may be effected by a single actuator, or by different actuators. This movement may be achieved by sliding the lens along the relevant axis, tilting the lens, or with the use of a steering motor.
[0127] The lens 110 interacts with the light source 102 to produce a pattern in the three-dimensional scene that matches the pattern of the cavities of the VCSEL array. This pattern provided by the light source 102 is a light pattern of non-uniform intensity, which can be described as spot illumination. In this example, the ratio of intensity of illumination at the spots to intensity of illumination between the spots is 30, in a condition where there is no background illumination or interference. More generally, the ratio may be 20 or greater, or 30 or greater, in such a condition.
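Using the contrast ratio given here for spot illumination, together with the flood-illumination ratio of 2 or less recited in claim 36, the illumination type can be classified from measured intensities. A minimal sketch with hypothetical names, assuming no background illumination or interference:

```python
def classify_illumination(spot_intensity: float, between_intensity: float) -> str:
    """Classify illumination by the ratio of intensity at the spots to
    intensity between the spots: 20 or greater indicates spot illumination,
    2 or less indicates flood illumination (thresholds from the description
    and claim 36)."""
    ratio = spot_intensity / between_intensity
    if ratio >= 20:
        return "spot"
    if ratio <= 2:
        return "flood"
    return "intermediate"
```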
[0128] The actuator 122, in this example, is a shape memory alloy (SMA) actuator comprising one or more SMA components for driving movement in the lens 110. Another type of actuator may be used, such as a voice coil motor (VCM) or voice coil actuator, or a microelectromechanical systems (MEMS) magnetic actuator. The actuator 122 engages the lens 110 to move the lens as described above, e.g. substantially along the axis 112. The actuator includes an interface 124 to a processor or controller 125, in particular, a computer controller.
[0129] The diffraction element, here the diffraction grating 126, is located between the lens 110 and the subject 104. The diffraction grating diffracts light from the lens, in particular collimated light from the lens, to provide more spots on the subject. This increases the resolution, which is dependent on the number of spots projected.
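Since resolution depends on the number of projected spots, the effect of the grating can be approximated by a simple count: each emitter's spot is replicated across the grating's diffraction orders. This is a simplified, hypothetical model (a real grating distributes intensity unevenly between orders, and some orders may be suppressed):

```python
def projected_spot_count(emitters: int, orders_x: int, orders_y: int) -> int:
    """Approximate number of spots projected onto the subject when a 2-D
    diffraction grating replicates the emitter pattern across orders_x by
    orders_y diffraction orders. Hypothetical counting model."""
    return emitters * orders_x * orders_y
```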
[0130] Artificial light or illumination 128 from the illumination source 102 is reflected or scattered by the subject 104 towards the sensor 108 of the time-of-flight sensor system 100. The sensor senses the light scattered by the subject from the illumination source 102.
[0131] The system processor or controller 125 is in communication with the illumination source 102, the actuator 122 and the sensor 108 to control them, as well as to receive and process data from them. The controller may be implemented in hardware or in software as a computer program stored in a non-transitory computer-readable medium, which, in this example, is memory of the device on which the time-of-flight sensor system is located, in this example, a smartphone. Although the system controller 125 is shown as a single unit, it may comprise separate modules or units which are in communication with one another. For example, the system controller 125 may comprise an actuator control unit, a sensor control unit, a light source control unit and an image processor. Alternatively, instead of a single integrated controller comprising each unit, the separate units may be arranged within their respective components. For example, the actuator 122 may itself comprise an actuator controller, the sensor may comprise a sensor controller, and the light source may comprise a light source controller. There may also be a separate image processor.
[0132] A scanning pattern of the spot illumination is achieved by the controller 125 controlling the actuator 122 to move the lens 110. The lens 110 directs the illumination source 102 towards the subject 104. The particular positioning of the spot pattern in relation to the subject is controlled by the controller 125 controlling the actuator 122. The controller 125 is able to control the actuator 122 to move the lens 110, which moves the spot pattern relative to the subject.
[0133] The spot illumination used in this example is illustrated in the accompanying drawings.
[0135] The movement depends on the type of actuator used, as different actuators have different characteristics. VCMs, for example, have a higher maximum speed, but exhibit a characteristic ringing (a decaying oscillation around the target position) when decelerating rapidly to the target position.
[0137] Thus, as the spot pattern is moved relative to the subject 104—by engagement of the actuator 122 on the lens 110—the spot pattern correspondingly moves across the sensor surface 302, as shown by the dashed arrows.
[0138] The sensor surface 302 comprises a number of pixels 304—e.g. one or more pixels. In the present embodiment, relatively large pixels are used. In particular, a sensor with approximately 10,000 pixels is used (although other sensors, such as 300,000-pixel sensors, may be used). As discussed previously, larger pixels are superior at collecting incident light compared to smaller pixels. However, using fewer pixels results in lower-resolution output frames produced from the sensor.
[0139] The time-of-flight sensor system is configured to provide a spot pattern comprising a grid of spots. The area of each pixel 304 of the sensor surface 302 is larger than the area of the sensor surface illuminated by a spot 201 of the grid of spots. Further, each spot is contained within the bounds of a pixel 304 and the number of spots in the grid of spots corresponds to the number of pixels 304 making up the sensor surface 302. In other words, the grid of spots is aligned with the array of pixels such that each pixel contains a spot.
[0140] For example, a spot pattern may comprise a single spot contained within a pixel. Alternatively, the spot pattern may comprise one or more spots in which a number of the spots cross pixel boundaries.
[0141] The relative calibration of the light source 102, optical system 106 and sensor 108 to ensure correct placement of the spots on the sensor surface 302—for example, correct alignment of individual spots with a respective pixel—may be performed when the system is assembled, for example, using active alignment or another appropriate method.
[0142] When the spot pattern is incident on the sensor surface 302, the data for each pixel 304 is read out by the sensor 108 and provided to the processor 125 for image processing. The data for each pixel 304 provides information on the illumination received by that pixel. As discussed above, if a pixel is not illuminated at all, it can provide no meaningful information about the subject. However, here, each pixel is illuminated by a spot, so every pixel contributes useful information.
[0143] The information is used by the processor 125 to form an output frame—in particular, a depth map—that provides time of flight information (e.g. depth information) regarding the subject in the scene. The output frame comprises frame pixels, each frame pixel reflecting the illumination data captured by the corresponding pixel on the sensor surface 302.
[0144] Once a first set of data, which reflects the areas of the sensor surface illuminated by the one or more sensed spots of light, has been passed to the processor, the spot pattern is moved relative to the subject such that each of the one or more sensed spots of light illuminates a respective second area of the sensor surface. The distance moved by each of the one or more spots of light is less than a distance spanned by a pixel of the one or more pixels. The distance spanned by a pixel across the sensor surface is sometimes referred to as the pixel pitch. In the present embodiment, the spots are moved within their respective pixels. In other words, the actuator 122 precisely adjusts the lens 110 to move each spot 201 across the sensor surface 302 within the bounds of its respective pixel 304. More specifically, the spot pattern is shifted so that each spot moves from the top-left corner of its respective pixel to the top-right corner of its respective pixel. Here, the distance moved is smaller than the pixel pitch, but far enough that the first and second areas illuminated by the spots do not overlap.
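The sub-pixel scan of paragraph [0144] visits positions within each pixel separated by less than the pixel pitch. A sketch of the four positions of one scan cycle, using a hypothetical coordinate convention with the origin at a pixel's top-left corner:

```python
def subpixel_positions(pixel_pitch: float):
    """The four spot positions within a single pixel across a scan cycle:
    top-left -> top-right -> bottom-right -> bottom-left. Each move is half
    the pixel pitch, i.e. less than the distance spanned by a pixel, so the
    illuminated areas do not overlap. Offsets are (x, y) from the pixel's
    top-left corner (hypothetical convention)."""
    h = pixel_pitch / 2.0
    return [(0.0, 0.0), (h, 0.0), (h, h), (0.0, h)]
```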
[0145] A second set of data, which reflects the second set of areas illuminated by the moved spots, is passed to the processor to produce a second output frame.
[0146] This process may be repeated a number of times to produce further output frames.
[0147] In more general terms, the spots of light comprise a set of regions in which the peak emitted intensity is substantially constant and/or in which the peak emitted intensity is at least 50% of a maximum intensity level. The set of regions together may cover between 1% and 50% of the sensor surface at a given instant of time, depending on the size of a spot relative to the sensor surface, and relative to its corresponding pixel. For example, the set of regions together may cover more than 10% and less than 50% or less than 40% or less than 30% or less than 20% of the sensor surface at a given instant of time, or the set of regions together cover more than 20% and less than 50% or less than 40% or less than 30% of the sensor surface at a given instant of time, or the set of regions together cover more than 30% and less than 50% or less than 40% of the sensor surface at a given instant of time, and optionally wherein the set of regions together cover more than 40% and less than 50% of the sensor surface at a given instant of time. Further, the set of regions are arranged such that the movement of the spot illumination causes the set of regions to cover more than 75% or more than 90% or substantially all of the sensor surface during a cycle of a scanning pattern, i.e. as the spots are moved relative to, or within, their respective pixels.
[0148] The controller 125 may determine where a spot is within a given pixel across each scan cycle (i.e. provide a spot pattern on the sensor surface, provide sensed data, and move the pattern to a new position) by appropriate calibration techniques. For example, during a start-up phase, the system may go through a process in which a zero position is determined, corresponding to the spots being in certain places on certain pixels. For example, considering a single pixel and a single spot, the actuator may move the spot to the left/right, and observe and determine the position at which the spot is no longer on the pixel. This gives the actuator position required for the spot to be on the left/right-hand edge of the pixel. This can be repeated for the up/down direction. Once a known position is found, the actuator control can define the position of the spot on the pixel. That is, it can be determined that moving the actuator by 10 µm (for example) moves the spot (or spots) over half a pixel width.
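The edge-finding calibration described above can be sketched as a bisection over actuator positions. `spot_on_pixel` is a hypothetical callback standing in for the real measurement of whether the spot still falls on the pixel:

```python
def find_pixel_edge(spot_on_pixel, lo: float, hi: float, tol: float = 1e-3) -> float:
    """Bisect for the actuator position at which a spot leaves its pixel,
    as in the start-up calibration described above. `lo` must be a position
    known to keep the spot on the pixel; `hi` a position known to move it off."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if spot_on_pixel(mid):
            lo = mid  # spot still on the pixel: edge lies further out
        else:
            hi = mid  # spot off the pixel: edge lies closer in
    return (lo + hi) / 2.0
```

Repeating this for the opposite edge and for the up/down direction gives the mapping between actuator position and spot position within the pixel.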
[0149] Once a number of output frames have been produced—here, four frames—the processor combines these frames to provide a final, super-resolution (high-resolution) frame.
[0150] The processor determines how to combine the output frames from the data received from the actuator and the sensor. For example, the initial spot position, along with the spot spacing and the distances the spots move across the scan cycle, as determined by the system, may be communicated to the processor. The processor then uses this information to build the super-resolution frame. This information may be captured each time a frame is generated and passed to the processor as metadata alongside the pixel information. Alternatively, in some arrangements, some or all of this information may not be necessary for the processor to build the super-resolution frame. For example, algorithms may be employed to combine the frames without prior knowledge of spot position.
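When the spot positions are known, combining four output frames reduces to interleaving their pixels. A minimal sketch assuming frames captured at the top-left, top-right, bottom-right and bottom-left sub-pixel positions, each represented as a list of rows of depth values:

```python
def combine_frames(tl, tr, br, bl):
    """Interleave four output frames, each captured at one sub-pixel spot
    position, into a single super-resolution frame with four times the
    pixel count (consistent with claim 61: the combined resolution equals
    the sum of the resolutions of the combined frames)."""
    out = []
    for row_tl, row_tr, row_br, row_bl in zip(tl, tr, br, bl):
        top = [v for pair in zip(row_tl, row_tr) for v in pair]
        bottom = [v for pair in zip(row_bl, row_br) for v in pair]
        out.append(top)
        out.append(bottom)
    return out
```

Usage: four one-pixel frames combine into one 2×2 frame, i.e. `combine_frames([[1]], [[2]], [[3]], [[4]])` yields `[[1, 2], [4, 3]]`.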
[0152] The super-resolution frame provides a high-resolution depth map for time-of-flight calculations. By combining the output frames, the resolution is increased fourfold without having to decrease pixel size. Further, by providing sub-pixel-size spots, and moving the spots within the bounds of their respective pixels during scanning, every pixel of the sensor surface contributes useful pixel data during frame generation. This improves efficiency.
[0153] Embodiments of the present invention have been described. It will be appreciated that variations and modifications may be made to the described embodiments within the scope of the present invention. For example, the example time-of-flight sensor system is described as located in a smartphone, however, the time-of-flight sensor system may be located in a computer, such as a laptop computer, tablet computer or desktop computer, in a vehicle, or other consumer device.
[0154] Further, although the actuator has been described as moving the spot illumination by engaging with a collimation lens, it will be appreciated that a different lens may be used. Alternatively, or in addition, the actuator may engage directly with the light source to move the spot illumination, or may engage with the diffraction element/grating, or may move the lens and the diffraction element/grating together relative to the light source to move the spot illumination relative to the subject. In other arrangements, a diffraction element/grating may not be used at all.