Apparatus and method with imaging reconstruction
11653099 · 2023-05-16
Assignee
Inventors
CPC classification
H04N23/74
ELECTRICITY
International classification
Abstract
A processor-implemented method with image reconstruction includes: acquiring information indicating an amount of ambient light in accordance with a shutter exposure time of a camera; generating an ambient light pattern based on the information about the amount of ambient light; generating a compensation pattern which compensates for an invertibility of an external illumination pattern based on the ambient light pattern; controlling an operation of an external illumination based on the compensation pattern to acquire a photographed image by the camera; and reconstructing a latent image of the photographed image in the acquired photographed image based on the compensation pattern.
Claims
1. A processor-implemented method with image reconstruction, comprising: acquiring information indicating an amount of ambient light in accordance with a shutter exposure time of a camera; generating an ambient light pattern based on the information about the amount of ambient light; generating a compensation pattern which compensates for an invertibility of an external illumination pattern based on the ambient light pattern; controlling an operation of an external illumination based on the compensation pattern to acquire a photographed image by the camera; and reconstructing a latent image of the photographed image in the acquired photographed image based on the compensation pattern.
2. The method of claim 1, wherein the generating of the compensation pattern comprises generating a compensation pattern corresponding to each color filter of the camera.
3. The method of claim 1, wherein the generating of the compensation pattern comprises generating a compensation pattern such that, for a joint point spread function generated in the photographed image based on the compensation pattern, a dispersion of a frequency domain function of the joint point spread function is small and a minimum value of a signal is large.
4. The method of claim 1, wherein the controlling of the operation of the external illumination to acquire the photographed image by the camera comprises applying the compensation pattern corresponding to different light sources of the external illumination to control the operation of the external illumination.
5. The method of claim 1, wherein the controlling of the operation of the external illumination to acquire the photographed image by the camera comprises synchronizing a shutter exposure timing of the camera and a timing to apply the compensation pattern of the external illumination.
6. The method of claim 1, wherein the reconstructing of the photographed image comprises: acquiring a background image photographed before a dynamic object appears; acquiring an image in which the dynamic object appears, by a camera which is synchronized with the external illumination to which the compensation pattern is applied; separating a dynamic object and a background from an image in which the dynamic object appears; reconstructing the separated dynamic object using the compensation pattern; and composing the reconstructed dynamic object with the background image.
7. A non-transitory computer-readable storage medium storing instructions that, when executed by one or more processors, configure the one or more processors to perform the method of claim 1.
8. An apparatus with image reconstruction, comprising: one or more processors configured to: acquire information about an amount of ambient light in accordance with a shutter exposure time of a camera; generate an ambient light pattern based on the information about the amount of ambient light; generate a compensation pattern which compensates for an invertibility of an external illumination pattern based on the ambient light pattern; control an operation of an external illumination and the camera based on the compensation pattern to acquire a photographed image; and reconstruct a latent image of the photographed image in the acquired photographed image based on the compensation pattern.
9. The apparatus of claim 8, wherein, for the generating of the compensation pattern, the one or more processors are configured to generate a compensation pattern corresponding to each color filter of the camera.
10. The apparatus of claim 8, wherein, for the generating of the compensation pattern, the one or more processors are configured to generate a compensation pattern such that, for a joint point spread function generated in the photographed image based on the compensation pattern, a dispersion of a frequency domain function of the joint point spread function is small and a minimum value of a signal is large.
11. The apparatus of claim 8, wherein, for the controlling of the operation of the external illumination, the one or more processors are configured to apply the compensation pattern corresponding to different light sources of the external illumination to control the operation of the external illumination.
12. The apparatus of claim 8, wherein, for the controlling of the operation of the external illumination, the one or more processors are configured to synchronize a shutter exposure timing of the camera and a timing to apply the compensation pattern of the external illumination.
13. The apparatus of claim 8, wherein, for the reconstructing of the photographed image, the one or more processors are configured to: acquire a background image photographed before a dynamic object appears; acquire an image in which the dynamic object appears by a camera synchronized with the external illumination to which the compensation pattern is applied; separate a dynamic object and a background from an image in which the dynamic object appears; and reconstruct the separated dynamic object based on the compensation pattern to be composed with the background image.
14. The apparatus of claim 8, further comprising: the camera; the external illumination comprising one or more light sources; and a sensor comprising either one or both of an actinometer and a spectrometer, and configured to generate the information about the amount of ambient light.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
(7) Throughout the drawings and the detailed description, unless otherwise described or provided, the same drawing reference numerals will be understood to refer to the same elements, features, and structures. The drawings may not be to scale, and the relative size, proportions, and depiction of elements in the drawings may be exaggerated for clarity, illustration, and convenience.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
(8) The following detailed description is provided to assist the reader in gaining a comprehensive understanding of the methods, apparatuses, and/or systems described herein. However, various changes, modifications, and equivalents of the methods, apparatuses, and/or systems described herein will be apparent after an understanding of the disclosure of this application. For example, the sequences of operations described herein are merely examples, and are not limited to those set forth herein, but may be changed as will be apparent after an understanding of the disclosure of this application, with the exception of operations necessarily occurring in a certain order. Also, descriptions of features that are known, after an understanding of the disclosure of this application, may be omitted for increased clarity and conciseness.
(9) Although terms of “first” or “second” are used herein to describe various members, components, regions, layers, or sections, these members, components, regions, layers, or sections are not to be limited by these terms. Rather, these terms are only used to distinguish one member, component, region, layer, or section from another member, component, region, layer, or section. Thus, a first member, component, region, layer, or section referred to in examples described herein may also be referred to as a second member, component, region, layer, or section without departing from the teachings of the examples.
(10) Throughout the specification, when an element, such as a layer, region, or substrate, is described as being “on,” “connected to,” or “coupled to” another element, it may be directly “on,” “connected to,” or “coupled to” the other element, or there may be one or more other elements intervening therebetween. In contrast, when an element is described as being “directly on,” “directly connected to,” or “directly coupled to” another element, there can be no other elements intervening therebetween. Likewise, expressions, for example, “between” and “immediately between” and “adjacent to” and “immediately adjacent to” may also be construed as described in the foregoing.
(11) The terminology used herein is for the purpose of describing particular examples only, and is not to be used to limit the disclosure. As used herein, the singular forms “a”, “an”, and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. As used herein, the term “and/or” includes any one and any combination of any two or more of the associated listed items. As used herein, the terms “include,” “comprise,” and “have” specify the presence of stated features, numbers, operations, elements, components, and/or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, operations, elements, components, and/or combinations thereof. The use of the term “may” herein with respect to an example or embodiment (for example, as to what an example or embodiment may include or implement) means that at least one example or embodiment exists where such a feature is included or implemented, while all examples are not limited thereto.
(12) Unless otherwise defined, all terms used herein including technical or scientific terms have the same meanings as those generally understood consistent with and after an understanding of the present disclosure. Terms, such as those defined in commonly used dictionaries, should be construed to have meanings matching with contextual meanings in the relevant art and the present disclosure, and are not to be construed as an ideal or excessively formal meaning unless otherwise defined herein.
(13) Hereinafter, examples will be described in detail with reference to the accompanying drawings. When describing the examples with reference to the accompanying drawings, like reference numerals refer to like components and a repeated description related thereto will be omitted.
(15) The image reconstructing apparatus 100 may be an apparatus which photographs a moving dynamic object and reconstructs the photographed image to be clear. Therefore, as shown in
(16) The number m of used external illuminations may be determined by a type of a color filter array (CFA) of the camera 130. When the number of types of the CFA of the camera 130 is L, an external illumination (e.g., the external illuminator 120) which includes a light source having a wavelength corresponding to each color filter of the camera 130 is employed.
(17) That is, when the camera 130 uses a Bayer-pattern CFA, an image having three channels (R, G, and B) may be acquired in one image, so that an external illumination having three light sources may be necessary. Further, when the camera 130 uses a near infrared filter (mono-NIR CFA), an image having two channels, monochrome and near infrared (NIR), may be acquired in one image. In this case, an external illumination having two light sources may be used.
(18) Thereafter, a compensation pattern may be applied to multiple channels of multiple frames. When the number of frames used for the compensation is C, the number of compensation patterns is M≤C×L. For example, when five frames are used and the number of photo sensors of the camera is four (L=4), M≤C×L=20.
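The M≤C×L bound above can be illustrated with a small sketch; the helper name is hypothetical and not part of the disclosure.

```python
def max_compensation_patterns(num_frames: int, num_channels: int) -> int:
    """Upper bound M <= C * L on the number of compensation patterns,
    where C is the number of frames used for compensation and L is the
    number of color channels (CFA types) of the camera."""
    return num_frames * num_channels

# Example from the text: five frames, four photo sensors (L=4).
print(max_compensation_patterns(5, 4))  # -> 20
```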
(19) According to an exemplary embodiment, the driving controller 150 may control the external illumination to minimize noise caused by ambient light in an image photographed by the camera 130.
(20) To this end, as shown in
(21) The image processor 140 may acquire an image photographed by the camera 130 which is synchronized with the external illuminator 120 (that is, a photographed image to which the compensation pattern is applied) to reconstruct the image.
(22) Here, the external illuminator 120 may be a high speed illumination which is controlled by the image reconstructing apparatus 100.
(23) In step S110 of
(24) For example, the ambient light measurer 110 may measure the change in the amount of light in accordance with the shutter exposure time of the camera, using a sensor of the ambient light measurer 110 such as an actinometer or a spectrometer. Alternatively, the ambient light measurer 110 may analyze and identify the ambient light from the image acquired by the camera 130.
(25) Here, the ambient light may include natural light and/or artificial light (which may not be controlled by a user). The artificial light may refer to light which is irradiated by an artificial illumination which is not controlled by the user (e.g., light that is not light irradiated by the external illuminator 120).
(26) In step S120 of
(27) The ambient light pattern generating module 142 may calculate and normalize a ratio r of a maximum value of brightness of the external light with respect to a maximum value of a brightness of the ambient light as expressed in Equation 1 below, for example.
Equation 1: r_max = (Brightness of external illumination)_max / (Brightness of ambient light)_max
(28) Here, the external light may be light irradiated by the external illuminator 120.
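Equation 1 can be sketched as follows; the function name and the sample brightness values are illustrative assumptions.

```python
def normalized_brightness_ratio(external_max: float, ambient_max: float) -> float:
    """Equation 1: r_max = (brightness of external illumination)_max
    divided by (brightness of ambient light)_max."""
    if ambient_max <= 0:
        raise ValueError("ambient brightness must be positive")
    return external_max / ambient_max

# Hypothetical measured maxima for the external illumination and ambient light.
print(normalized_brightness_ratio(800.0, 200.0))  # -> 4.0
```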
(29) In step S130 of
(30) When a merged pattern is generated by reflecting the normalized ambient light pattern and the compensation pattern, a compensation pattern may be generated in which a dispersion of a frequency domain function of a point spread function of the joint blur is small and a minimum value of the signal is large. As illustrated in
(31) As the invertibility is reduced, the noise of the reconstructed image may be amplified. Therefore, after patterning the pattern of the ambient light through the observation, when patterns (compensation patterns) which compensate therefor are generated to control the external illumination, the noise may be minimized.
(32) For example, the merged pattern may be generated by calculating a coded factor where the coded factor is a value which minimizes a sum of dispersions of a modulated transfer function (MTF) of the merged pattern. Such a merged pattern may be expressed by the following Equation 2, for example.
(34) L compensation patterns may be generated by Equation 2. Compensation patterns Ω_coded which compensate for L observed ambient light patterns may be generated. Here, L is the number of observed ambient light patterns, and the number of compensation patterns is equal to the number of ambient light patterns.
(35) In the meantime, a sequence of the ambient light may be Q = [q_1, q_2, . . . , q_n], a set of ambient light may be Ω_amb = [Q_1, . . . , Q_m], a sequence of the external illuminations may be P = [p_1, p_2, . . . , p_n], and the compensation pattern may be Ω_coded = [P_1, . . . , P_m].
(36) When the compensation pattern Ω_coded is determined by Equation 2 in accordance with the pattern Ω_amb of the ambient light and the amount Σ_j^m Σ_i^n q_i^j of ambient light, the value r may be calculated from the determined amount of light of the external illumination.
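The pattern selection described above (a merged pattern whose frequency-domain function has small dispersion and a large minimum value) can be sketched as a brute-force search. This is an illustrative stand-in for the coded-factor optimization of Equation 2, not the patented procedure, and the ambient sequence values are assumed.

```python
import itertools

import numpy as np


def choose_compensation_pattern(q: np.ndarray) -> np.ndarray:
    """Exhaustively search binary patterns p so that the merged pattern
    p + q has an MTF |FFT(p + q)| with small dispersion (variance) and a
    large minimum value. Brute force is only feasible for short sequences;
    a real implementation would use the optimization of Equation 2."""
    n = len(q)
    best_p, best_score = None, -np.inf
    for bits in itertools.product([0, 1], repeat=n):
        p = np.array(bits, dtype=float)
        if p.sum() == 0:                    # an all-off illumination is useless
            continue
        mtf = np.abs(np.fft.fft(p + q))
        score = mtf.min() - mtf.var()       # favor large minimum, small dispersion
        if score > best_score:
            best_p, best_score = p, score
    return best_p


q = np.array([0.2, 0.5, 0.1, 0.4, 0.3, 0.2])  # hypothetical ambient light sequence
print(choose_compensation_pattern(q))
```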
(37) In step S140 of
(38) The driving controller 150 may synchronize the external illuminator 120 with the camera 130 before controlling the external illumination. As the camera 130 may acquire light only for a shutter exposure time, an operation time of the external illumination may match the shutter exposure time of the camera. Accordingly, the driving controller 150 may match a trigger signal which operates the camera and Ω.sub.coded of the external illuminator 120.
(39) The external illuminator 120 and the camera 130 may operate by the synchronized trigger signal and the synchronized Ω.sub.coded to acquire an image. The image acquired as described above may be an image in which the compensation pattern compensating for the illumination pattern is reflected.
(40) In step S160 of
(41) The compensation pattern P is a binary pattern which controls the illumination, and the pattern used to reconstruct the acquired image may be expressed by B below, for example.
(43) B_i is a compensation pattern value of an i-th channel in which a motion blur coded by P_i and Q_i is generated.
(44) A non-limiting example of the image reconstructing process will be described below in more detail with reference to
(46) In step S161, a background image photographed before a dynamic object appears may be acquired.
(47) The image processor may acquire a clear background image before the compensation pattern is reflected (that is, before the dynamic object appears).
(48) In step S163, an image in which the dynamic object appears may be acquired by the camera synchronized with the external illumination to which the compensation pattern is applied and a foreground (a dynamic object) and the background may be separated from the acquired image. The image in which the dynamic object appears may be acquired by a camera which is synchronized with the operation of the external illumination to which the compensation pattern is applied.
(49) In step S165, the dynamic object may be reconstructed in the separated foreground image.
(50) A motion blur image may be expressed as follows, for example.
Equation 3: y = h * x + n
(51) Here, y is a motion-blurred acquired image, h is a motion blur kernel which is a point spread function, x is a clear image, and n is noise. When the influence of n is insignificant, the reconstructed clear image may be expressed as follows, for example.
x̂ = A†y
(52) Here, A is a circulant matrix of h and A† is a pseudo-inverse of A.
(53) P_j + Q_j (for j = 1, . . . , m) of Equation 2, expressed by P and Q generated according to the exemplary embodiment of the present disclosure, corresponds to h of Equation 3, and thus an image x̂ reconstructed from the acquired image y may be expressed by the following equation, for example.
x̂_L = A_L†y_L
(54) Here, L is an index indicating each channel of the used camera image.
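The per-channel reconstruction x̂ = A†y can be sketched as follows, assuming a short 1-D signal, a hypothetical blur kernel, and negligible noise n; the circulant matrix A is built directly from h.

```python
import numpy as np


def circulant_matrix(h: np.ndarray, n: int) -> np.ndarray:
    """Circulant matrix A of the blur kernel h, zero-padded to length n,
    so that A @ x is the circular convolution h * x."""
    col = np.zeros(n)
    col[:len(h)] = h
    return np.column_stack([np.roll(col, k) for k in range(n)])


def reconstruct(y: np.ndarray, h: np.ndarray) -> np.ndarray:
    """x_hat = pinv(A) @ y, the pseudo-inverse reconstruction above,
    assuming the noise term n is negligible."""
    A = circulant_matrix(h, len(y))
    return np.linalg.pinv(A) @ y


x = np.array([0.0, 1.0, 0.0, 0.0, 2.0, 0.0])  # hypothetical clear signal
h = np.array([0.5, 0.3, 0.2])                 # hypothetical blur kernel
y = circulant_matrix(h, len(x)) @ x           # blurred observation y = h * x
print(np.allclose(reconstruct(y, h), x))      # -> True
```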
(55) In step S167, the reconstructed foreground image is composed with the background image.
(56) The foreground image reconstructed in step S165 may be composed with the background image acquired in step S161.
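Steps S161 through S167 can be sketched in miniature as follows; the thresholded-difference separation, the threshold value, and the sample arrays are illustrative assumptions rather than the disclosed separation method.

```python
import numpy as np


def composite(background: np.ndarray, frame: np.ndarray,
              restored_fg: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """Separate the dynamic object (foreground) from the background by a
    thresholded difference, then paste the reconstructed foreground back
    over the clean background image."""
    mask = np.abs(frame - background) > threshold  # dynamic-object mask
    out = background.copy()
    out[mask] = restored_fg[mask]
    return out


bg = np.zeros((4, 4))                   # background acquired before the object appears
frame = bg.copy()
frame[1, 1] = 1.0                       # dynamic object enters one pixel
restored = np.full((4, 4), 0.5)         # stand-in for the deblurred foreground
result = composite(bg, frame, restored)
print(result[1, 1])  # -> 0.5
```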
(58) The graph represents the MTF of the pattern in which the ambient light pattern Q and the compensation pattern P are merged. In
(59) In
(60) As seen from
(62) Referring to
(63) The electronic apparatus 600 may be a computing device, an image acquisition device, or a display device. The electronic apparatus 600 may be, for example, a personal computer (PC), an advanced driver assistance system (ADAS), a head-up display (HUD) device, a camera, a 3D digital information display (DID), a navigation device, a neuromorphic device, a 3D mobile device, a smartphone, a smart television (TV), a smart vehicle, an internet of things (IoT) device, a medical device, or the like. The 3D mobile device may include, for example, a display device configured to display augmented reality (AR), virtual reality (VR), and/or mixed reality (MR), a head-mounted display (HMD), a face-mounted display (FMD), and AR glasses.
(64) The electronic apparatus 600 may include a processor 610 (e.g., one or more processors), a memory 630 (e.g., one or more memories), a sensor 670 (e.g., one or more sensors), and a communication interface 650. These components of the electronic apparatus 600 may communicate with one another through a communication bus 605.
(65) The processor 610 may control an overall operation of the electronic apparatus 600 and implement operations or methods by execution of instructions stored in the memory 630. The processor 610 may include the image processor 140 and the driving controller 150 described above, as a non-limiting example. The processor 610 may be configured to perform one or more or all steps, operations, or methods described above with reference to
(66) The memory 630 may store information used by the processor 610 to perform operations. For example, the memory 630 may store instructions, which when executed by the processor 610, configure the processor to perform one or more or all steps, operations, or methods described above with reference to
(67) The sensor 670 may include the ambient light measurer 110 and the camera 130 of
(68) The communication interface 650 may communicate with external devices. The communication interface 650 may include the illuminator 120 of
(69) The image reconstructing apparatuses, light measurers, external illuminators, cameras, image processors, driving controllers, electronic apparatuses, processors, memories, sensors, communication interfaces, image reconstructing apparatus 100, light measurer 110, external illuminator 120, camera 130, image processor 140, driving controller 150, electronic apparatus 600, processor 610, memory 630, sensor 670, communication interface 650, and other apparatuses, devices, units, modules, and components described herein with respect to
(70) The methods illustrated in
(71) Instructions or software to control computing hardware, for example, one or more processors or computers, to implement the hardware components and perform the methods as described above may be written as computer programs, code segments, instructions or any combination thereof, for individually or collectively instructing or configuring the one or more processors or computers to operate as a machine or special-purpose computer to perform the operations that are performed by the hardware components and the methods as described above. In one example, the instructions or software include machine code that is directly executed by the one or more processors or computers, such as machine code produced by a compiler. In another example, the instructions or software includes higher-level code that is executed by the one or more processors or computer using an interpreter. The instructions or software may be written using any programming language based on the block diagrams and the flow charts illustrated in the drawings and the corresponding descriptions in the specification, which disclose algorithms for performing the operations that are performed by the hardware components and the methods as described above.
(72) The instructions or software to control computing hardware, for example, one or more processors or computers, to implement the hardware components and perform the methods as described above, and any associated data, data files, and data structures, may be recorded, stored, or fixed in or on one or more non-transitory computer-readable storage media. Examples of a non-transitory computer-readable storage medium include read-only memory (ROM), random-access programmable read only memory (PROM), electrically erasable programmable read-only memory (EEPROM), random-access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), flash memory, non-volatile memory, CD-ROMs, CD-Rs, CD+Rs, CD-RWs, CD+RWs, DVD-ROMs, DVD-Rs, DVD+Rs, DVD-RWs, DVD+RWs, DVD-RAMs, BD-ROMs, BD-Rs, BD-R LTHs, BD-REs, Blu-ray or optical disk storage, hard disk drive (HDD), solid state drive (SSD), flash memory, a card type memory such as multimedia card micro or a card (for example, secure digital (SD) or extreme digital (XD)), magnetic tapes, floppy disks, magneto-optical data storage devices, optical data storage devices, hard disks, solid-state disks, and any other device that is configured to store the instructions or software and any associated data, data files, and data structures in a non-transitory manner and provide the instructions or software and any associated data, data files, and data structures to one or more processors or computers so that the one or more processors or computers can execute the instructions. In one example, the instructions or software and any associated data, data files, and data structures are distributed over network-coupled computer systems so that the instructions and software and any associated data, data files, and data structures are stored, accessed, and executed in a distributed fashion by the one or more processors or computers.
(73) While this disclosure includes specific examples, it will be apparent after an understanding of the disclosure of this application that various changes in form and details may be made in these examples without departing from the spirit and scope of the claims and their equivalents. The examples described herein are to be considered in a descriptive sense only, and not for purposes of limitation. Descriptions of features or aspects in each example are to be considered as being applicable to similar features or aspects in other examples. Suitable results may be achieved if the described techniques are performed in a different order, and/or if components in a described system, architecture, device, or circuit are combined in a different manner, and/or replaced or supplemented by other components or their equivalents.