Multi-aperture camera system having auto focusing function and/or depth estimation function

10613417 · 2020-04-07

Abstract

A multi-aperture camera system having an auto focusing function comprises: multiple apertures; an image sensor that creates multiple images by processing light signals introduced through the multiple apertures, respectively; and an auto focusing unit that determines a distance by which the image sensor moves relative to the multiple apertures by using the multiple images for auto focusing, wherein at least one of the multiple apertures has a central position that is misaligned with those of the remaining apertures other than the at least one aperture.

Claims

1. A multi-aperture camera system having an auto focusing function, the system comprising: a plurality of apertures; an image sensor configured to generate a plurality of images by processing an optical signal introduced through each of the plurality of apertures; and an auto focusing unit configured to determine a distance the image sensor moves using the characteristics of the plurality of apertures and the plurality of images for auto focusing, wherein at least one aperture among the plurality of apertures has a central location which is misaligned with a central location of the other aperture, wherein the auto focusing unit determines a distance where the image sensor moves relative to the plurality of apertures, based on a difference between central locations of the plurality of images, a distance between a central location of the at least one aperture and a central location of the other aperture, a distance between an optical system in which the plurality of apertures are formed and the image sensor, a subject distance which is focused on the image sensor, and a focal distance.

2. The system of claim 1, wherein the plurality of apertures are formed on one optical system.

3. The system of claim 1, wherein the at least one aperture is formed on the other aperture.

4. The system of claim 3, wherein the other aperture is formed by including a filter which detects a specific optical signal in an optical system in which the plurality of apertures are formed.

5. The system of claim 4, wherein the at least one aperture is formed by etching a specific region on the other aperture.

6. The system of claim 1, wherein the at least one aperture and the other aperture are each formed to have a shape of one of a circle, a triangle, a quadrangle, or a polygon.

7. The system of claim 1, wherein the at least one aperture introduces an optical signal of a different wavelength from a wavelength of an optical signal introduced by the other aperture.

8. The system of claim 1, wherein the auto focusing unit verifies the distance where the image sensor moves relative to the plurality of apertures, using a difference between blur levels in the plurality of images.

Description

DESCRIPTION OF DRAWINGS

(1) FIG. 1 is a drawing illustrating a principle for an auto focusing function and a depth estimation function of a camera system according to an embodiment of the present disclosure;

(2) FIG. 2 is a drawing illustrating an auto focusing function and a depth estimation function of a camera system according to an embodiment of the present disclosure;

(3) FIG. 3 is a drawing illustrating a plurality of apertures according to an embodiment of the present disclosure;

(4) FIGS. 4A, 4B, and 4C are drawings illustrating a plurality of apertures according to another embodiment of the present disclosure;

(5) FIG. 5 is a flowchart illustrating an auto focusing method of a camera system according to an embodiment of the present disclosure;

(6) FIG. 6 is a block diagram illustrating a camera system having an auto focusing function according to an embodiment of the present disclosure;

(7) FIG. 7 is a flowchart illustrating a depth estimation method of a camera system according to an embodiment of the present disclosure; and

(8) FIG. 8 is a block diagram illustrating a camera system having a depth estimation function according to an embodiment of the present disclosure.

BEST MODE

(9) Hereinafter, a description will be given in detail of embodiments with reference to the accompanying drawings. However, the present disclosure is not restricted or limited to these embodiments. Further, like reference numerals shown in each drawing indicate like members.

(10) Further, the terminology used in the specification may be terms used to properly represent an exemplary embodiment of the present disclosure and may vary according to intention of a user or an operator or custom of a field included in the present disclosure. Therefore, the terminology will be defined based on contents across the specification.

(11) FIG. 1 is a drawing illustrating a principle for an auto focusing function and a depth estimation function of a camera system according to an embodiment of the present disclosure.

(12) Referring to FIG. 1, if an image surface 110 on which an image of a subject is focused is located on location 1 which is a focal distance in the camera system according to an embodiment of the present disclosure (if an image sensor which generates an image by processing an optical signal is located in the image surface 110), since the subject is correctly focused, an optical signal introduced through each of a plurality of apertures 120 and 130 may be processed to generate a clear image. Hereinafter, an embodiment is exemplified as the plurality of apertures 120 and 130 included in the camera system include the aperture 120 which introduces a visible ray and the aperture 130 which introduces an infrared ray. However, embodiments are not limited thereto. For example, the plurality of apertures 120 and 130 may include various apertures which introduce optical signals of different wavelengths.

(13) However, if the image surface 110 is located on location 2, since the subject is incorrectly focused and a blur occurs, an optical signal introduced through each of the plurality of apertures 120 and 130 may be processed to generate an image in which the blur exists. In this case, since a central location of a visible image generated by the visible aperture 120 among a plurality of images for the subject is not identical to a central location of an infrared image generated by the infrared aperture 130, there may be a phenomenon in which the center of the infrared image is skewed towards the right with respect to the center of the visible image. On the other hand, if the image surface 110 is located on location 3, there may be a phenomenon in which the center of the infrared image is skewed towards the left with respect to the center of the visible image.

(14) The camera system according to an embodiment of the present disclosure may have an auto focusing function and a depth estimation function using the above-mentioned principle by including the plurality of apertures 120 and 130, central locations of which are misaligned with each other. A detailed description for this is given hereafter.

(15) FIG. 2 is a drawing illustrating an auto focusing function and a depth estimation function of a camera system according to an embodiment of the present disclosure.

(16) Referring to FIG. 2, the camera system according to an embodiment of the present disclosure may have the auto focusing function and the depth estimation function by including a plurality of apertures 210 and 220 using the above-mentioned principle.

(17) In detail, a disparity p between a central location of a visible image generated by the visible aperture 210 and a central location of an infrared image generated by the infrared aperture 220 among the plurality of apertures 210 and 220 may be calculated as in Equation 1, based on the above-mentioned principle.

(18) p = \frac{x}{D} \cdot \frac{f^2}{F_{\#}(a_0 - f)} \left( \frac{a_0}{a} - 1 \right) [Equation 1]

(19) In Equation 1, x may refer to a disparity between the central location of the visible aperture 210 and the central location of the infrared aperture 220, D may refer to a diameter of the visible aperture 210, f may refer to a focal distance, F # may refer to a brightness value of a lens in the camera system, a may refer to a subject distance, and a.sub.0 may refer to a subject distance which is focused on an image sensor 230.

(20) In this case, if a value of the disparity p between the central location of the visible image and the central location of the infrared image is changed from a positive number to a negative number or is changed from the negative number to the positive number, a direction of a disparity between the two images may be changed. Thus, it may be distinguished whether a subject having a boundary blur is located in the foreground or background, according to a sign of a value of the disparity p.
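The sign behavior described above can be sketched numerically. The following is a minimal sketch of Equation 1; the function and parameter names and the sample numbers are illustrative (not from the patent), F_number stands for the lens brightness value F#, and all distances are assumed to share one unit.

```python
# Minimal sketch of Equation 1: the predicted disparity p between the
# visible-image center and the infrared-image center.

def disparity(x, D, f, F_number, a0, a):
    """Disparity p for a subject at distance a (Equation 1).

    x  : offset between the centers of the two apertures
    D  : diameter of the visible aperture
    f  : focal distance
    a0 : subject distance that is in focus on the image sensor
    """
    return (x / D) * (f ** 2 / (F_number * (a0 - f))) * (a0 / a - 1.0)

# A subject in front of the focused plane (a < a0) yields p of one sign and
# a subject behind it (a > a0) yields the opposite sign, which is how the
# system distinguishes foreground from background.
p_near = disparity(x=1.0, D=4.0, f=5.0, F_number=1.25, a0=1000.0, a=500.0)
p_far = disparity(x=1.0, D=4.0, f=5.0, F_number=1.25, a0=1000.0, a=2000.0)
```

Which side counts as positive depends on the aperture layout; only the sign change itself carries the foreground/background information.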

(21) Further, depth a for the subject (a depth from the subject to an optical system 240 in which the plurality of apertures 210 and 220 are formed) may be calculated as in Equation 2, derived from Equation 1.

(22) a = \frac{a_0}{1 + \frac{a_0 - f}{f} \cdot \frac{p}{x}} [Equation 2]

(23) In Equation 2, a.sub.0 may refer to a subject distance which is focused on the image sensor 230, f may refer to a focal distance, p may refer to a disparity between the central location of the visible image and the central location of the infrared image, and x may refer to a disparity between the central location of the visible aperture 210 and the central location of the infrared aperture 220.
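Because F# = f/D, Equation 1 simplifies to p = x·f·(a0/a − 1)/(a0 − f), and Equation 2 is its exact algebraic inverse, which a short round-trip sketch confirms. Function names and sample numbers are illustrative assumptions.

```python
# Round-trip sketch of Equations 1 and 2.

def disparity_from_depth(x, f, a0, a):
    # Equation 1 with D * F# = f substituted.
    return x * f * (a0 / a - 1.0) / (a0 - f)

def depth_from_disparity(x, f, a0, p):
    # Equation 2.
    return a0 / (1.0 + (a0 - f) / f * p / x)

x, f, a0 = 1.0, 5.0, 1000.0
p = disparity_from_depth(x, f, a0, a=400.0)
a_est = depth_from_disparity(x, f, a0, p)   # recovers 400.0
```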

(24) Further, distance b in which the image sensor 230 moves for auto focusing may be calculated as in Equation 3.

(25) b = \frac{b_0}{\frac{x}{p} - 1} = \frac{1}{\frac{x}{p} - 1} \cdot \frac{a_0 f}{a_0 - f} [Equation 3]

(26) In Equation 3, p may refer to a disparity between the central location of the visible image and the central location of the infrared image, x may refer to a disparity between the central location of the visible aperture 210 and the central location of the infrared aperture 220, b.sub.0 may refer to a distance between the optical system 240 in which the plurality of apertures 210 and 220 are formed and the image sensor 230, a.sub.0 may refer to a subject distance which is focused on the image sensor 230, and f may refer to a focal distance.
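A numerical sketch of Equation 3, using b0 = a0·f/(a0 − f) from the thin-lens relation, shows that for a disparity consistent with Equation 1 (with D·F# = f) the result b equals the difference between the image distances of the actual and the focused subject in this example. Names and numbers are illustrative.

```python
# Sketch of Equation 3: the distance b the image sensor moves for auto focusing.

def sensor_move_distance(x, p, f, a0):
    b0 = a0 * f / (a0 - f)        # current optics-to-sensor distance
    return b0 / (x / p - 1.0)     # Equation 3

x, f, a0, a = 1.0, 5.0, 1000.0, 400.0
p = x * f * (a0 / a - 1.0) / (a0 - f)   # disparity implied by Equation 1
b = sensor_move_distance(x, p, f, a0)
```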

(27) As such, the camera system according to an embodiment of the present disclosure may estimate a depth as in Equation 2 and may automatically adjust a focus as in Equation 3 by including the plurality of apertures 210 and 220 formed such that their central locations are misaligned with each other.

(28) In this case, the plurality of apertures 210 and 220 may be formed on the one optical system 240. A detailed description for this will be given with reference to FIGS. 3 to 4C.

(29) Further, the camera system may verify the estimated depth and the adjusted focus, as in Equations 4 to 6, using a difference between blur levels in a plurality of images respectively generated by the plurality of apertures 210 and 220.

(30) First of all, blur d of each of a visible image and an infrared image may be represented as in Equation 4.

(31) d = \frac{f^2}{F_{\#}(a_0 - f)} \cdot \left| \frac{a_0}{a} - 1 \right| [Equation 4]

(32) In Equation 4, f may refer to a focal distance, F # may refer to a brightness value of a lens in the camera system, a may refer to a depth from the subject to the optical system 240 in which the plurality of apertures 210 and 220 are formed, and a.sub.0 may refer to a subject distance which is focused on the image sensor 230.

(33) Thus, the camera system may calculate Equation 5 indicating a depth from the subject to the optical system 240 in which the plurality of apertures 210 and 220 are formed and may calculate Equation 6 indicating a distance b in which the image sensor 230 moves for auto focusing.

(34) a = \frac{a_0}{1 + \frac{F_{\#}(a_0 - f)}{f^2} d} \; (a < a_0), \qquad a = \frac{a_0}{1 - \frac{F_{\#}(a_0 - f)}{f^2} d} \; (a > a_0) [Equation 5]

(35) In Equation 5, F # may refer to a brightness value of the lens in the camera system, a.sub.0 may refer to a subject distance which is focused on the image sensor 230, f may refer to a focal distance, and d may refer to a level of a blur of each of a visible image and an infrared image or a disparity between blurs of the visible image and the infrared image. Thus, the camera system may verify a depth estimated from Equation 2 using Equation 5.

(36) b = \frac{b_0}{\left| \frac{f}{d F_{\#}} - 1 \right|} = \frac{1}{\left| \frac{f}{d F_{\#}} - 1 \right|} \cdot \frac{a_0 f}{a_0 - f} [Equation 6]

(37) In Equation 6, d may refer to a level of a blur of each of the visible image and the infrared image or a disparity between blurs of the visible image and the infrared image, f may refer to a focal distance, F # may refer to a brightness value of the lens in the camera system, b.sub.0 may refer to a distance between the optical system 240 in which the plurality of apertures 210 and 220 are formed and the image sensor 230, and a.sub.0 may refer to a subject distance which is focused on the image sensor 230. Thus, the camera system may verify the distance in which the image sensor 230 moves for auto focusing, calculated from Equation 3, using Equation 6.
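Equations 4 to 6 can be sketched the same way. Here abs() stands in for the absolute-value bars, the a < a0 branch of Equation 5 is assumed, and the function names and numbers are illustrative.

```python
# Sketch of Equations 4 to 6: the blur level d and the blur-based checks of
# the depth and the sensor move distance.

def blur_level(f, F_number, a0, a):
    # Equation 4: blur d of each image.
    return f ** 2 / (F_number * (a0 - f)) * abs(a0 / a - 1.0)

def depth_from_blur(d, f, F_number, a0):
    # Equation 5, branch for a subject in front of the focused plane (a < a0).
    return a0 / (1.0 + F_number * (a0 - f) * d / f ** 2)

def sensor_move_from_blur(d, f, F_number, a0):
    # Equation 6, with b0 = a0*f/(a0 - f).
    b0 = a0 * f / (a0 - f)
    return b0 / abs(f / (d * F_number) - 1.0)

f, F_number, a0 = 5.0, 1.25, 1000.0
d = blur_level(f, F_number, a0, a=400.0)
a_check = depth_from_blur(d, f, F_number, a0)        # 400.0
b_check = sensor_move_from_blur(d, f, F_number, a0)
```

For this example the blur-based results reproduce the disparity-based depth and sensor move distance, which is the verification the text describes.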

(38) FIG. 3 is a drawing illustrating a plurality of apertures according to an embodiment of the present disclosure.

(39) Referring to FIG. 3, at least one aperture 310 among a plurality of apertures 310 and 320 according to an embodiment of the present disclosure may have a central location which is misaligned with a central location of the other aperture 320 except for the at least one aperture 310 among the plurality of apertures 310 and 320. Hereinafter, an embodiment is exemplified as the plurality of apertures 310 and 320 are configured with the two apertures. However, embodiments are not limited thereto. For example, the plurality of apertures 310 and 320 may be configured with three apertures, four apertures, or the like. A detailed description for this will be given with reference to FIGS. 4A to 4C.

(40) Further, the plurality of apertures 310 and 320 may be formed on one optical system. Particularly, the at least one aperture 310 among the plurality of apertures 310 and 320 may be formed on the other aperture 320.

(41) In this case, the other aperture 320 may be formed by including a filter which detects a specific optical signal in the optical system in which the plurality of apertures 310 and 320 are formed, and the at least one aperture 310 may be formed by etching a specified region on the other aperture 320. For example, the other aperture 320 may be formed by coating a filter which blocks an optical signal of greater than or equal to a wavelength of 650 nm on a rear surface of an optical system which is formed of a glass plate and coating a filter which blocks an optical signal of greater than or equal to a wavelength of 810 nm on a front surface of the optical system which is formed of the glass plate. Further, the at least one aperture 310 may be formed by etching a specific region on the other aperture 320 to have a central location which is misaligned with a central location of the other aperture 320.

(42) Thus, the at least one aperture 310 may introduce an optical signal of a different wavelength from a wavelength of an optical signal introduced by the other aperture 320. For example, the optical signal introduced through the other aperture 320 may be an optical signal of a visible wavelength, and the optical signal introduced through the at least one aperture 310 may be an optical signal of a visible wavelength and an optical signal of an infrared wavelength.

(43) In this case, each of the plurality of apertures may be formed in various forms to have one of a circle, a triangle, a quadrangle, or a polygon.

(44) Further, the optical system in which the plurality of apertures 310 and 320 are formed may be adaptively located on an upper or lower portion of a lens included in a camera system.

(45) FIGS. 4A to 4C are drawings illustrating a plurality of apertures according to another embodiment of the present disclosure.

(46) Referring to FIGS. 4A to 4C, the plurality of apertures included in a camera system according to an embodiment of the present disclosure may vary in the number and the form of the apertures.

(47) For example, as shown in reference numeral 410, the plurality of apertures included in the camera system may be configured with a first aperture which introduces a red-green (RG) signal and a second aperture which introduces a blue (B) signal, the first aperture and the second aperture being formed on one optical system. Further, as shown in reference numeral 420, the plurality of apertures may be configured with a first aperture which introduces an RGB signal and a second aperture which introduces an infrared (IR) signal.

(48) The plurality of apertures are not restricted or limited to those shown in FIGS. 4A to 4C and may be configured with various numbers of apertures in various forms.

(49) FIG. 5 is a flowchart illustrating an auto focusing method of a camera system according to an embodiment of the present disclosure.

(50) Referring to FIG. 5, in operation 510, the camera system according to an embodiment of the present disclosure may generate a plurality of images by processing an optical signal introduced through each of a plurality of apertures.

(51) Herein, at least one aperture among the plurality of apertures may have a central location which is misaligned with a central location of the other aperture except for the at least one aperture among the plurality of apertures. In this case, the at least one aperture may introduce an optical signal of a different wavelength from a wavelength of an optical signal introduced by the other aperture.

(52) Further, the plurality of apertures may be formed on one optical system. Particularly, the at least one aperture among the plurality of apertures may be formed on the other aperture except for the at least one aperture among the plurality of apertures.

(53) For example, the other aperture may be formed by including a filter which detects a specific optical signal in the optical system where the plurality of apertures are formed, and the at least one aperture may be formed by etching a specific region on the other aperture.

(54) Further, each of the plurality of apertures may be formed to have one of a circle, a triangle, a quadrangle, or a polygon.

(55) In operation 510, the camera system may process the plurality of images and may calculate parameters for auto focusing. For example, the camera system may calculate a disparity between central locations of the plurality of images, a distance between a central location of the at least one aperture and a central location of the other aperture, a distance between the optical system in which the plurality of apertures are formed and an image sensor, a subject distance which is focused on the image sensor, a focal distance, and the like, as the parameters for the auto focusing.

(56) In operation 520, the camera system may determine a moving distance of the image sensor using the characteristics of the plurality of apertures and the plurality of images for auto focusing.

(57) In detail, the camera system may determine a moving distance of the image sensor using the characteristics of the plurality of apertures, based on a difference between central locations of the plurality of images, a distance between a central location of the at least one aperture and a central location of the other aperture, a distance between the optical system in which the plurality of apertures are formed and the image sensor, a subject distance which is focused on the image sensor, and a focal distance, which are obtained from the plurality of images.

(58) Further, although not illustrated in FIG. 5, the camera system may verify the moving distance of the image sensor using the characteristics of the plurality of apertures and a difference between blur levels in the plurality of images.

(59) Thus, the camera system may focus by moving the image sensor, or by moving a lens barrel to which the optical system in which the plurality of apertures are formed is fixed so as to move the image sensor relatively, based on the distance determined in operation 520.
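The FIG. 5 flow can be sketched end to end. The centroid-difference estimator standing in for operation 510 is an illustrative assumption (the patent does not prescribe how the center disparity is measured), and in practice the pixel disparity must be converted to the same physical unit as x before Equation 3 is applied in operation 520.

```python
# End-to-end sketch of the FIG. 5 auto focusing flow.

def centroid_col(img):
    # Intensity centroid of a 2-D list of pixel values along the column axis.
    total = weighted = 0.0
    for row in img:
        for c, v in enumerate(row):
            total += v
            weighted += c * v
    return weighted / total

def center_disparity(visible, infrared):
    # Operation 510 (assumed estimator): centroid difference along the axis
    # of the aperture offset.
    return centroid_col(infrared) - centroid_col(visible)

def autofocus_move(p, x, f, a0):
    # Operation 520: Equation 3 with b0 = a0*f/(a0 - f); p must already be
    # expressed in the same unit as x.
    b0 = a0 * f / (a0 - f)
    return b0 / (x / p - 1.0)

# Two synthetic single-blob frames whose centers differ by 2 pixels stand
# in for the visible and infrared images.
vis = [[0.0] * 8 for _ in range(8)]
ir = [[0.0] * 8 for _ in range(8)]
vis[4][3] = 1.0
ir[4][5] = 1.0
p_pixels = center_disparity(vis, ir)   # 2.0
```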

(60) FIG. 6 is a block diagram illustrating a camera system having an auto focusing function according to an embodiment of the present disclosure.

(61) Referring to FIG. 6, the camera system according to an embodiment of the present disclosure may include a plurality of apertures 610, an image sensor 620, and an auto focusing unit 630.

(62) Herein, at least one aperture among the plurality of apertures 610 may have a central location which is misaligned with a central location of the other aperture except for the at least one aperture among the plurality of apertures 610. In this case, the at least one aperture may introduce an optical signal of a different wavelength from a wavelength of an optical signal introduced by the other aperture.

(63) Further, the plurality of apertures 610 may be formed on one optical system. Particularly, the at least one aperture among the plurality of apertures 610 may be formed on the other aperture except for the at least one aperture among the plurality of apertures 610.

(64) For example, the other aperture may be formed by including a filter which detects a specific optical signal in the optical system where the plurality of apertures 610 are formed, and the at least one aperture may be formed by etching a specific region on the other aperture.

(65) Further, each of the plurality of apertures 610 may be formed to have one of a circle, a triangle, a quadrangle, or a polygon.

(66) The image sensor 620 may generate a plurality of images by processing an optical signal introduced through each of the plurality of apertures 610.

(67) In this case, the image sensor 620 may process the plurality of images and may calculate parameters for auto focusing. For example, the image sensor 620 may calculate a disparity between central locations of the plurality of images, a distance between a central location of the at least one aperture and a central location of the other aperture, a distance between the optical system in which the plurality of apertures 610 are formed and the image sensor 620, a subject distance which is focused on the image sensor 620, a focal distance, and the like, as the parameters for the auto focusing.

(68) The auto focusing unit 630 may determine a moving distance of the image sensor 620 using the characteristics of the plurality of apertures 610 and the plurality of images for auto focusing.

(69) In detail, the auto focusing unit 630 may determine a moving distance of the image sensor 620 using the characteristics of the plurality of apertures 610, based on a difference between central locations of the plurality of images, a distance between the central location of the at least one aperture and the central location of the other aperture, a distance between the optical system in which the plurality of apertures 610 are formed and the image sensor 620, a subject distance which is focused on the image sensor 620, and a focal distance, which are obtained from the plurality of images.

(70) Further, although not illustrated in FIG. 6, the auto focusing unit 630 may verify the moving distance of the image sensor 620 using the characteristics of the plurality of apertures 610 and a difference between blur levels in the plurality of images.

(71) Thus, the auto focusing unit 630 may focus by moving the image sensor 620, or by moving a lens barrel to which the optical system in which the plurality of apertures 610 are formed is fixed so as to move the image sensor 620 relatively, based on the determined distance.

(72) FIG. 7 is a flowchart illustrating a depth estimation method of a camera system according to an embodiment of the present disclosure.

(73) Referring to FIG. 7, in operation 710, the camera system according to an embodiment of the present disclosure may generate a plurality of images by processing an optical signal introduced through each of a plurality of apertures.

(74) Herein, at least one aperture among the plurality of apertures may have a central location which is misaligned with a central location of the other aperture except for the at least one aperture among the plurality of apertures. In this case, the at least one aperture may introduce an optical signal of a different wavelength from a wavelength of an optical signal introduced by the other aperture.

(75) Further, the plurality of apertures may be formed on one optical system. Particularly, the at least one aperture among the plurality of apertures may be formed on the other aperture except for the at least one aperture among the plurality of apertures.

(76) For example, the other aperture may be formed by including a filter which detects a specific optical signal in the optical system where the plurality of apertures are formed, and the at least one aperture may be formed by etching a specific region on the other aperture.

(77) Further, each of the plurality of apertures may be formed to have one of a circle, a triangle, a quadrangle, or a polygon.

(78) In operation 710, the camera system may process the plurality of images and may calculate parameters for depth estimation. For example, the camera system may calculate a disparity between central locations of the plurality of images, a distance between a central location of the at least one aperture and a central location of the other aperture, a subject distance which is focused on an image sensor, a focal distance, and the like, as the parameters for the depth estimation.

(79) In operation 720, the camera system may estimate a depth from a subject to the optical system in which the plurality of apertures are formed, using the plurality of images.

(80) In detail, the camera system may estimate the depth from the subject to the optical system in which the plurality of apertures are formed, based on a disparity between the central locations of the plurality of images, a distance between the central location of the at least one aperture and the central location of the other aperture, a subject distance which is focused on the image sensor, and a focal distance, which are obtained from the plurality of images.

(81) Further, although not illustrated in FIG. 7, the camera system may verify the depth from the subject to the optical system in which the plurality of apertures are formed, using a disparity between blur levels in the plurality of images.
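Operations 710 and 720, together with the blur-based verification just described, can be sketched as follows: estimate the depth with Equation 2 and cross-check it with Equation 5. With F# = f/D the two estimates agree exactly; the names and numbers are illustrative, and the a < a0 branch is assumed.

```python
# Sketch of the FIG. 7 depth estimation flow plus its blur-based check.

def depth_from_disparity(p, x, f, a0):
    return a0 / (1.0 + (a0 - f) / f * p / x)              # Equation 2

def depth_from_blur(d, f, F_number, a0):
    return a0 / (1.0 + F_number * (a0 - f) * d / f ** 2)  # Equation 5

x, f, F_number, a0, a_true = 1.0, 5.0, 5.0, 1000.0, 250.0
D = f / F_number                 # aperture diameter implied by F# = f/D
p = (x / D) * (f ** 2 / (F_number * (a0 - f))) * (a0 / a_true - 1.0)  # Eq. 1
d = p * D / x                    # blur level implied by p = (x/D) * d
a_disp = depth_from_disparity(p, x, f, a0)
a_blur = depth_from_blur(d, f, F_number, a0)
```

Both estimates recover the true subject distance, mirroring the verification step of operation 720.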

(82) FIG. 8 is a block diagram illustrating a camera system having a depth estimation function according to an embodiment of the present disclosure.

(83) Referring to FIG. 8, the camera system according to an embodiment of the present disclosure may include a plurality of apertures 810, an image sensor 820, and a depth estimation unit 830.

(84) Herein, at least one aperture among the plurality of apertures 810 may have a central location which is misaligned with a central location of the other aperture except for the at least one aperture among the plurality of apertures 810. In this case, the at least one aperture may introduce an optical signal of a different wavelength from a wavelength of an optical signal introduced by the other aperture.

(85) Further, the plurality of apertures 810 may be formed on one optical system. Particularly, the at least one aperture among the plurality of apertures 810 may be formed on the other aperture except for the at least one aperture among the plurality of apertures 810.

(86) For example, the other aperture may be formed by including a filter which detects a specific optical signal in the optical system where the plurality of apertures 810 are formed, and the at least one aperture may be formed by etching a specific region on the other aperture.

(87) Further, each of the plurality of apertures 810 may be formed to have one of a circle, a triangle, a quadrangle, or a polygon.

(88) The image sensor 820 may generate a plurality of images by processing an optical signal introduced through each of the plurality of apertures 810.

(89) In this case, the image sensor 820 may process the plurality of images and may calculate parameters for depth estimation. For example, the image sensor 820 may calculate a disparity between central locations of the plurality of images, a distance between a central location of the at least one aperture and a central location of the other aperture, a subject distance which is focused on the image sensor 820, a focal distance, and the like, as the parameters for the depth estimation.

(90) The depth estimation unit 830 may estimate a depth from a subject to the optical system in which the plurality of apertures are formed, using the plurality of images.

(91) In detail, the depth estimation unit 830 may estimate the depth from the subject to the optical system in which the plurality of apertures are formed, based on a disparity between the central locations of the plurality of images, a distance between the central location of the at least one aperture and the central location of the other aperture, a subject distance which is focused on the image sensor 820, and a focal distance, which are obtained from the plurality of images.

(92) Further, although not illustrated in FIG. 8, the depth estimation unit 830 may verify the depth from the subject to the optical system in which the plurality of apertures 810 are formed, using a disparity between blur levels in the plurality of images.

MODE FOR INVENTION

(93) While a few exemplary embodiments have been shown and described with reference to the accompanying drawings, it will be apparent to those skilled in the art that various modifications and variations can be made from the foregoing descriptions. For example, adequate effects may be achieved even if the foregoing processes and methods are carried out in a different order than described above, and/or the aforementioned elements, such as systems, structures, devices, or circuits, are combined or coupled in different forms and modes than as described above or are substituted or replaced by other components or equivalents.

(94) Therefore, other implementations, other embodiments, and equivalents to the claims are within the scope of the following claims.