Method of setting a focus to acquire images of a moving object and corresponding device
10187563 · 2019-01-22
Assignee
Inventors
CPC classification
G01S17/58
PHYSICS
G01S7/08
PHYSICS
G02B7/40
PHYSICS
H04N23/663
ELECTRICITY
International classification
G01S17/58
PHYSICS
G01S7/08
PHYSICS
Abstract
At least one image of a moving object is acquired using an image acquisition device equipped with an automatic focusing system. A speed of the moving object is estimated from distance measurements. A distance between the object and the device at the time of the effective acquisition of the image is estimated based on the estimated speed and on the period of time separating the time of actuation, which triggers the process for acquiring the at least one image, from the time of the effective acquisition; the automatic focusing system then takes the estimated distance into account.
Claims
1. A method, comprising: measuring a plurality of distances using at least one time-of-flight sensor of an image acquisition device; estimating, using the image acquisition device and during an image acquisition cycle of the image acquisition device, a speed of a moving object based on the plurality of distances measured using the at least one time-of-flight sensor of the image acquisition device, wherein an acquisition time of at least one of the measured plurality of distances precedes a time of activation of the image acquisition cycle; estimating, by the image acquisition device and during the image acquisition cycle, a distance between the image acquisition device and the moving object at an effective image acquisition time based on the estimated speed of the moving object and a period of time between the time of activation of the image acquisition cycle of the image acquisition device and the effective image acquisition time; setting, by the image acquisition device and during the image acquisition cycle, a focus of the image acquisition device based on the estimated distance; and acquiring, by the image acquisition device and during the image acquisition cycle, an image of the moving object at the effective acquisition time using the focus set by the image acquisition device.
2. The method of claim 1, comprising acquiring, using the image acquisition device, a plurality of images of the moving object.
3. The method of claim 1, comprising: responding to a command to auto-focus the image acquisition device by estimating a speed of a moving object.
4. The method of claim 1 wherein the setting the focus of the image acquisition device comprises setting a position of at least one lens of objective optics of the image acquisition device.
5. The method of claim 1 wherein the period of time is a constant.
6. The method of claim 1 wherein the image acquisition device comprises objective optics and a motor, the motor having a plurality of states of progression each corresponding to a focusing range of the objective optics.
7. The method of claim 6 wherein the at least one time-of-flight sensor has a maximum range of detection and the maximum range of detection of the at least one time-of-flight sensor is at least 65% of a maximum focusing range of the objective optics.
8. A device, comprising: one or more inputs and one or more outputs; and circuitry, coupled to at least one of the one or more inputs and to at least one of the one or more outputs, and which, during an image acquisition cycle: estimates a speed of a moving object with respect to the device based on a plurality of distances measured by at least one time-of-flight sensor, wherein an acquisition time of at least one of the measured plurality of distances precedes a time of activation of the image acquisition cycle; estimates a distance between the device and the moving object at an effective image acquisition time based on the estimated speed of the moving object and a period of time between the time of activation of the image acquisition cycle and the effective image acquisition time; and controls acquisition of an image of the moving object at the effective image acquisition time, the controlling including setting a focus to acquire the image of the moving object at the effective image acquisition time based on the estimated distance.
9. The device of claim 8, comprising: objective optics including at least one lens, wherein the circuitry sets the focus by outputting a control signal to set a position of the at least one lens.
10. The device of claim 9, comprising: image acquisition circuitry, which, in one mode of operation, acquires images of moving objects; and an actuator, which, in operation, actuates an image acquisition cycle of the image acquisition circuitry.
11. The device of claim 10, comprising the at least one time-of-flight sensor.
12. The device according to claim 11 wherein the at least one lens comprises a first field of view and the at least one time-of-flight sensor comprises a second field of view covering at least one third of the first field of view.
13. The device according to claim 11 wherein the image acquisition circuitry comprises a motor having a plurality of states of progression each corresponding to a focusing range of the objective optics.
14. The device of claim 13 wherein the at least one time-of-flight sensor has a maximum range of detection and the maximum range of detection of the at least one time-of-flight sensor is at least 65% of a maximum focusing range of the objective optics.
15. The device of claim 8, comprising at least one of: a touch screen; and mobile telephone circuitry.
16. A system, comprising: at least one time-of-flight sensor; image acquisition circuitry, which, in operation, acquires images of objects; objective optics including at least one lens; and auto-focus circuitry, which, in operation: measures a plurality of distances using the at least one time-of-flight sensor; and, during an image acquisition cycle, estimates a speed of a moving object relative to the objective optics based on the plurality of distances measured by the at least one time-of-flight sensor, wherein an acquisition time of at least one of the measured plurality of distances precedes a time of activation of the image acquisition cycle; estimates a distance between the objective optics and the moving object at an effective image acquisition time based on the estimated speed of the moving object and a period of time between the time of activation of the image acquisition cycle of the image acquisition circuitry and the effective image acquisition time; and sets a focus of the at least one lens based on the estimated distance.
17. The system of claim 16, comprising at least one of: a touch screen; and mobile telephone circuitry.
18. The system of claim 16, wherein the auto-focus circuitry comprises a motor, the motor having a plurality of states of progression each corresponding to a focusing range of the objective optics.
19. The system of claim 18 wherein the at least one time-of-flight sensor has a maximum range of detection and the maximum range of detection of the at least one time-of-flight sensor is at least 65% of a maximum focusing range of the objective optics.
20. A non-transitory computer-readable memory medium containing contents which when executed by an image acquisition device cause the image acquisition device to perform a method, the method comprising: measuring a plurality of distances using at least one time-of-flight sensor; estimating, during an image acquisition cycle of the image acquisition device, a speed of a moving object in a field of view of the image acquisition device based on the plurality of distances measured by the at least one time-of-flight sensor of the image acquisition device, wherein an acquisition time of at least one of the measured plurality of distances precedes a time of activation of the image acquisition cycle; estimating, during the image acquisition cycle, a distance between the image acquisition device and the moving object at an effective image acquisition time based on the estimated speed of the moving object and a period of time between the time of activation of the image acquisition cycle of the image acquisition device and the effective image acquisition time; setting, during the image acquisition cycle, a focus of the image acquisition device based on the estimated distance; and acquiring, during the image acquisition cycle, an image of the moving object at the effective acquisition time using the focus set by the image acquisition device.
21. The non-transitory computer-readable memory medium of claim 20 wherein the setting the focus of the image acquisition device comprises setting a position of at least one lens of objective optics of the image acquisition device.
22. The non-transitory computer-readable memory medium of claim 21 wherein the image acquisition device comprises a motor having a plurality of states of progression each corresponding to a focusing range of the objective optics.
23. The non-transitory computer-readable memory medium of claim 22 wherein the at least one time-of-flight sensor has a maximum range of detection and the maximum range of detection of the at least one time-of-flight sensor is at least 65% of a maximum focusing range of the objective optics.
Description
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
(1) One or more embodiments will now be described, purely by way of non-limiting example, with reference to the annexed figures.
DETAILED DESCRIPTION
(6) In the ensuing description, numerous specific details are provided in order to facilitate an understanding of the embodiments given by way of example. The embodiments may be implemented with or without these specific details, or with other methods, components, materials, etc. In other cases, well-known structures, materials, or operations are not shown or described in detail so as not to obscure aspects of the embodiments. Reference in the present description to an embodiment or one embodiment means that a given feature, structure, or characteristic described in connection with that embodiment is included in at least one embodiment. Hence, occurrences of phrases such as in an embodiment or in one embodiment at various points of the present description do not necessarily refer to one and the same embodiment. Moreover, the features, structures, or characteristics may be combined in any convenient way in one or more embodiments.
(7) The notations and references are here provided only for convenience of the reader and do not define the scope or the meaning of the embodiments.
(9) The image acquisition device ACI comprises objective optics OBF including at least one lens L. For improved optical performance, a plurality of different lenses may be used.
(10) The image acquisition device ACI furthermore comprises: an image acquisition circuit or circuitry MCI, for example a matrix of pixels associated with a microcontroller; a triggering device MD (for example a push button, or an icon on the screen of the camera function of a cellular mobile telephone or of a tablet), which, when actuated, activates the said acquisition circuit MCI; and an automatic focusing system MPA configured to control the positioning of the said at least one lens.
(11) The focusing system MPA may be activated continuously and/or in response to the actuation of the triggering device MD, depending on the configuration of the acquisition device ACI. An actuation of the triggering device MD activates the said acquisition circuit MCI and, in this regard, leads to:
(12) determining an estimated distance D between at least one object OBT, of which it is desired to acquire at least one image, and the objective optics OBF at the time of the effective acquisition of the image, and
(13) controlling the focusing position of the said at least one lens L as a function of the determined distance D, so as to acquire the image of the object OBT with improved focusing, notably when the object is moving.
(14) The image acquisition device ACI furthermore comprises a controller MC. The controller MC comprises an estimation block or circuitry ME configured to estimate the speed of the moving object and a calculation block or circuitry MCAL configured to perform a calculation of distances.
(15) The blocks MC, ME and MCAL may be implemented in whole or in part as software modules incorporated within the microcontroller.
(16) Furthermore, the estimation block ME comprises at least one sensor CAP based on the time-of-flight principle and having a maximum range of detection.
(17) The structure and the operation of such a sensor are well-known to those skilled in the art.
(18) The sensor CAP is configured to emit a light beam towards an object situated within the said maximum range of detection and to measure the return travel time of the beam between the sensor and the said at least one object. The time of flight of the said light beam is proportional to the distance between the sensor and the said at least one object.
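By way of illustration only (not part of the patent disclosure), the ranging principle just described can be sketched as follows; the function name and the example round-trip time are assumptions:

```python
# Illustrative only: the time-of-flight ranging principle described above.
# The emitted pulse travels to the object and back, so the one-way
# distance is half the round-trip optical path.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_round_trip_time(round_trip_s: float) -> float:
    """Distance in metres for a measured round-trip time in seconds."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2.0

# A round trip of about 6.67 ns corresponds to roughly 1 m.
print(distance_from_round_trip_time(6.671e-9))  # ~1.0 m
```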
(19) The electronic device AE may comprise one or more processors P, one or more memories M, and discrete circuitry DC, which may be employed alone and in various combinations to implement the functionality of the electronic device AE. The electronic device AE also may comprise a bus system BUS to couple various inputs and outputs of the functional blocks of the electronic device together, for example, to couple outputs of the sensor CAP to inputs of the estimation block ME, to couple outputs of the controller MC to inputs of the focusing system MPA, etc.
(20) Reference is now made to
(21) The sensor CAP is configured to determine distances between the object OBT and the sensor (or the objective optics OBF) at a certain frequency, for example 30 Hz, which allows the estimation block ME to estimate the distance between the sensor and the said at least one object at various detection times, for example T1, T2, and to track the variation of this distance virtually in real time.
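The periodic sampling described above might be kept, purely as a hypothetical sketch with invented names, in a short history buffer of (time, distance) samples so that the most recent readings are available when the trigger fires:

```python
from collections import deque

# Hypothetical sketch (all names invented): a short history of
# (time, distance) samples delivered by a time-of-flight sensor at a
# fixed rate, e.g. 30 Hz.

class DistanceHistory:
    def __init__(self, max_samples: int = 8):
        # Oldest samples are discarded automatically once the buffer is full.
        self._samples = deque(maxlen=max_samples)

    def add(self, time_s: float, distance_m: float) -> None:
        self._samples.append((time_s, distance_m))

    def latest_two(self):
        """Return the two most recent (time, distance) samples."""
        if len(self._samples) < 2:
            raise ValueError("need at least two samples")
        return self._samples[-2], self._samples[-1]

hist = DistanceHistory()
period_s = 1.0 / 30.0  # 30 Hz sampling period
for i, d in enumerate([2.00, 1.95, 1.90]):  # object approaching
    hist.add(i * period_s, d)
print(hist.latest_two())
```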
(22) When, at the time of actuation T_a, the triggering device MD is actuated so as to send an image acquisition command, the estimation block ME is configured to estimate the speed V_OBT of the object from distances between the object OBT and the objective optics OBF determined by the said sensor CAP at times at least one of which precedes the time of actuation T_a.
(23) By way of example, the times T2 and T_a will be used here, although the times T1 and T2 could equally have been used.
(24) The speed of the object V_OBT may then be determined using the formula below, where D(T2) and D(T_a) denote the distances determined at the times T2 and T_a:
(25) V_OBT = (D(T_a) − D(T2)) / (T_a − T2)
(26) It should be noted that the speed of the object V_OBT corresponds to the speed along the optical axis of the objective optics OBF.
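The speed estimate from two distance samples can be sketched as below; this is an illustrative reading of the description, not the patented implementation, with D(t) denoting the distance determined at time t and the numeric values invented:

```python
# Illustrative sketch: axial speed estimated from two distance samples.

def estimate_axial_speed(t_prev: float, d_prev: float,
                         t_curr: float, d_curr: float) -> float:
    """Rate of change of the object/optics distance in m/s; negative
    when the object approaches the objective optics."""
    if t_curr == t_prev:
        raise ValueError("samples must be taken at distinct times")
    return (d_curr - d_prev) / (t_curr - t_prev)

# Object at 2.00 m at time T2 = 0.0 s and at 1.90 m at Ta = 0.1 s:
v_obt = estimate_axial_speed(0.0, 2.00, 0.1, 1.90)
print(v_obt)  # ~-1.0 m/s (approaching)
```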
(27) The calculation block MCAL is configured to determine the distance D_c using the said estimated speed V_OBT and the period of time separating the time of actuation T_a from the time T_c of effective acquisition; this period is known since it depends on the characteristics of the device.
(28) The distance D_c between the object OBT and the objective optics OBF at the said acquisition time T_c can therefore be determined by the calculation block MCAL by applying the following formula:
(29) D_c = D(T_a) + V_OBT × (T_c − T_a)
(30) The automatic focusing system MPA controls the positioning of the said at least one lens L taking the determined distance D_c into account.
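Putting the steps together, a hypothetical sketch of the extrapolation performed between actuation (time T_a) and effective acquisition (time T_c); the function name and the numeric values are assumptions, not taken from the patent:

```python
# Hypothetical sketch: extrapolating the object/optics distance from the
# actuation time to the effective acquisition time.

def extrapolate_distance(d_at_trigger_m: float, speed_m_per_s: float,
                         trigger_to_capture_s: float) -> float:
    """Expected object/optics distance at the effective acquisition
    time, given the distance at actuation and the axial speed."""
    return d_at_trigger_m + speed_m_per_s * trigger_to_capture_s

# 1.90 m at the trigger, approaching at 1 m/s, 50 ms between actuation
# and effective acquisition:
d_c = extrapolate_distance(1.90, -1.0, 0.050)
print(d_c)  # ~1.85 m
```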
(31) An embodiment facilitates improving the sharpness of the acquired image, for example, when the object OBT is moving at speed.
(32) It should be noted that the said at least one lens L comprises a first field of view CV1 and the said at least one sensor comprises a second field of view CV2, as illustrated in
(33) In an embodiment, in order to increase the chances of success of the automatic focusing during the acquisition of the said at least one moving object OBT, the second field of view CV2 may be placed in the middle of the first field of view CV1.
(34) In an embodiment, the second field of view CV2 may cover at least one third of the said first field of view CV1.
(35) The image acquisition circuit MCI may comprise a motor MP for driving and controlling the progression of the said at least one lens L, so as to implement the automatic focusing taking into account the said distance D_c between the object OBT and the objective optics OBF at the said acquisition time T_c.
(36) The said motor MP may, for example, be a moving-coil motor, commonly known by those skilled in the art under the acronym VCM (for Voice Coil Motor), comprising a plurality of states of progression EA.
(37) Each state of progression EA corresponds to a focusing range P of the objective optics OBF. The density of distribution of the states of progression may not be uniform as a function of the said range P: the closer the object is to the objective optics OBF, the higher the density of the states of progression that may be employed.
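As a purely illustrative sketch, a target focus distance can be mapped onto a discrete set of progression states whose density is higher at short range, as in the non-uniform distribution described above; the table values here are invented, not taken from the patent:

```python
import bisect

# Invented state-to-distance table: denser at short range, mirroring the
# non-uniform distribution of progression states described above.
STATE_DISTANCES_M = [0.10, 0.12, 0.15, 0.20, 0.30, 0.50, 1.00, 3.00]

def state_for_distance(target_m: float) -> int:
    """Index of the progression state whose focus distance is closest
    to the requested target distance."""
    i = bisect.bisect_left(STATE_DISTANCES_M, target_m)
    if i == 0:
        return 0
    if i == len(STATE_DISTANCES_M):
        return len(STATE_DISTANCES_M) - 1
    before = STATE_DISTANCES_M[i - 1]
    after = STATE_DISTANCES_M[i]
    return i if (after - target_m) < (target_m - before) else i - 1

print(state_for_distance(0.14))  # 2: the 0.15 m state is closest
```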
(38) An embodiment facilitates a high level of performance of the automatic focusing by employing a sensor whose maximum range of detection D_cmax is equal to at least 65% of the maximum focusing range P_max of the objective optics OBF.
(39) Some embodiments may take the form of or include computer program products. For example, according to one embodiment there is provided a computer readable medium including a computer program adapted to perform one or more of the methods or functions described above. The medium may be a physical storage medium such as for example a Read Only Memory (ROM) chip, or a disk such as a Digital Versatile Disk (DVD-ROM), Compact Disk (CD-ROM), a hard disk, a memory, a network, or a portable media article to be read by an appropriate drive or via an appropriate connection, including as encoded in one or more barcodes or other related codes stored on one or more such computer-readable mediums and being readable by an appropriate reader device.
(40) Furthermore, in some embodiments, some of the systems and/or modules and/or circuits and/or blocks may be implemented or provided in other manners, such as at least partially in firmware and/or hardware, including, but not limited to, one or more application-specific integrated circuits (ASICs), digital signal processors, discrete circuitry, logic gates, standard integrated circuits, state machines, look-up tables, controllers (e.g., by executing appropriate instructions, and including microcontrollers and/or embedded controllers), field-programmable gate arrays (FPGAs), complex programmable logic devices (CPLDs), etc., as well as devices that employ RFID technology, and various combinations thereof.
(41) The various embodiments described above can be combined to provide further embodiments. Aspects of the embodiments can be modified, if necessary to employ concepts of the various patents, applications and publications to provide yet further embodiments.
(42) These and other changes can be made to the embodiments in light of the above-detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled. Accordingly, the claims are not limited by the disclosure.