Solid-state imaging device, manufacturing method thereof, and camera with arranged pixel combinations alternatively
09793313 · 2017-10-17
CPC classification (ELECTRICITY)
H10F39/803
H10F39/806
H04N25/77
H04N25/75
H04N25/78
Abstract
A solid-state imaging device includes a semiconductor substrate; and a pixel unit having a plurality of pixels on the semiconductor substrate, wherein the pixel unit includes first pixel groups having two or more pixels and second pixel groups being different from the first pixel groups, wherein a portion of the pixels in the first pixel groups and a portion of the pixels in the second pixel groups share a floating diffusion element.
Claims
1. An imaging device, comprising: a pixel array including a plurality of sub-arrays, respective ones of the plurality of sub-arrays including four pixels arranged in a 2×2 matrix, respective ones of the pixels including a photodiode and a transfer transistor, and a plurality of on-chip lenses, wherein each pixel of a first sub-array of the plurality of sub-arrays and a second sub-array of the plurality of sub-arrays includes a green-light-transmitting filter, each pixel of a third sub-array of the plurality of sub-arrays includes a blue-light-transmitting filter, each pixel of a fourth sub-array of the plurality of sub-arrays includes a red-light-transmitting filter, the first sub-array is disposed diagonally to the second sub-array, the third sub-array is disposed diagonally to the fourth sub-array, and each of the plurality of on-chip lenses is disposed corresponding to the four pixels in a different one of the first, second and third sub-arrays.
2. The imaging device according to claim 1, wherein four pixels of the pixel array share a floating diffusion element.
3. The imaging device according to claim 1, further comprising a plurality of optical waveguides, wherein a respective optical waveguide is disposed between the respective on-chip lens and the corresponding sub-array.
4. The imaging device according to claim 3, wherein a respective optical waveguide corresponding to the first sub-array, a respective optical waveguide corresponding to the second sub-array, a respective optical waveguide corresponding to the third sub-array, and a respective optical waveguide corresponding to the fourth sub-array have a same opening size as one another.
5. The imaging device according to claim 1, wherein a respective on-chip lens corresponding to the first sub-array, a respective on-chip lens corresponding to the second sub-array, a respective on-chip lens corresponding to the third sub-array, and a respective on-chip lens corresponding to the fourth sub-array have a same diameter as one another.
6. The imaging device according to claim 1, further comprising a conductive layer disposed between the first sub-array and the second sub-array.
7. The imaging device according to claim 1, wherein four pixels of the pixel array share an amplification transistor and a reset transistor.
8. The imaging device according to claim 1, wherein the four pixels of the first, second, third and fourth sub-arrays are separated by a first insulating material.
9. The imaging device according to claim 8, wherein each of the first, second, third and fourth sub-arrays has only four pixels separated by the first insulating material.
10. The imaging device according to claim 1, wherein the green-light-transmitting filter comprises a single filter disposed corresponding to the four pixels of the first sub-array.
11. The imaging device according to claim 1, wherein each of the plurality of optical waveguides comprises a second insulating material, and the second insulating material comprises silicon oxide, silicon nitride, silicon carbide, or a combination thereof.
12. An imaging system, comprising: an imaging device, including: a pixel array including a plurality of sub-arrays, respective ones of the plurality of sub-arrays including four pixels arranged in a 2×2 matrix, respective ones of the pixels including a photodiode and a transfer transistor, and a plurality of on-chip lenses, wherein each pixel of a first sub-array of the plurality of sub-arrays and a second sub-array of the plurality of sub-arrays includes a green-light-transmitting filter, each pixel of a third sub-array of the plurality of sub-arrays includes a blue-light-transmitting filter, each pixel of a fourth sub-array of the plurality of sub-arrays includes a red-light-transmitting filter, the first sub-array is disposed diagonally to the second sub-array, the third sub-array is disposed diagonally to the fourth sub-array, and each of the plurality of on-chip lenses is disposed corresponding to the four pixels in a different one of the first, second and third sub-arrays.
13. The imaging system according to claim 12, wherein four pixels of the pixel array share a floating diffusion element.
14. The imaging system according to claim 12, further comprising a plurality of optical waveguides, wherein a respective optical waveguide is disposed between the respective on-chip lens and the corresponding sub-array.
15. The imaging system according to claim 14, wherein a respective optical waveguide corresponding to the first sub-array, a respective optical waveguide corresponding to the second sub-array, a respective optical waveguide corresponding to the third sub-array, and a respective optical waveguide corresponding to the fourth sub-array have a same opening size as one another.
16. The imaging system according to claim 12, wherein a respective on-chip lens corresponding to the first sub-array, a respective on-chip lens corresponding to the second sub-array, a respective on-chip lens corresponding to the third sub-array, and a respective on-chip lens corresponding to the fourth sub-array have a same diameter as one another.
17. The imaging system according to claim 12, further comprising a conductive layer disposed between the first sub-array and the second sub-array.
18. The imaging system according to claim 12, wherein four pixels of the pixel array share an amplification transistor and a reset transistor.
19. The imaging system according to claim 12, further comprising an optical system configured to form an image on an imaging surface of the imaging device.
20. The imaging system according to claim 12, further comprising a signal processing circuit configured to process an output signal of the imaging device and output a video signal.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DESCRIPTION OF THE PREFERRED EMBODIMENTS
(17) Hereinafter, the solid-state imaging device, the manufacturing method thereof, and the camera including the solid-state imaging device according to embodiments of the present invention will be described with reference to the accompanying drawings.
(18) In this regard, the description will be made in the following order:
1. First Embodiment (Configuration in Which Two Green Pixels Are Arranged Both in Horizontal and Vertical Directions and a Total of Four Green Pixels Are Arranged)
2. First Modified Example
3. Second Modified Example
4. Second Embodiment (Configuration in Which Green, Blue, and Red Pixels Are Each Arranged by Twos Both in Horizontal and Vertical Directions and a Total of Four Pixels of Each Color Are Arranged)
5. Third Embodiment (Configuration in Which a Floating Diffusion Is Shared Between Pixels)
6. Fourth Embodiment (Configuration of Upper Layer Wiring)
7. Fifth Embodiment (Configuration in Which Pluralities of Green Pixels Are Arranged in First and Second Directions)
8. First Working Example
9. Second Working Example
10. Third Working Example
11. Fourth Working Example
12. Sixth Embodiment (Camera Employing Solid-State Imaging Device)
First Embodiment
(19) [Overall Configuration of Solid-State Imaging Device]
(21) In the solid-state imaging device according to this embodiment, photodiodes are formed separately for each pixel arranged in a matrix on a light sensing surface of a semiconductor substrate. In addition, a signal reading unit for reading the signal charge generated and accumulated in the photodiodes, and a voltage in accordance with the signal charge, is formed on the semiconductor substrate. Moreover, an insulating film including an optical waveguide is formed on the semiconductor substrate so as to cover the photodiodes, color filters are formed on the insulating film, and on-chip lenses are formed on the color filters.
(22) [Layout of Color Filters and On-Chip Lenses]
(23) The layout shown in
(24) That is, there are first pixel combinations 1 and second pixel combinations 2. The first pixel combination 1 has a configuration in which two green pixels (G) are arranged both in the horizontal and vertical directions, so that a total of four green pixels, each provided with a green color filter, are arranged. The second pixel combination 2 has a configuration in which two pixels are arranged both in the horizontal and vertical directions, so that a total of four pixels are arranged, with two red pixels (R) having red color filters and two blue pixels (B) having blue color filters arranged cater-cornered. In the layout, the first pixel combinations 1 and the second pixel combinations 2 are alternately arranged both in the horizontal and vertical directions.
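As a rough illustration, the alternating tiling described above can be sketched in Python. The particular diagonal orientation of the red and blue pixels, and all names in the sketch, are assumptions made for illustration, not details taken from the drawings:

```python
# First pixel combination 1: a 2x2 block of green pixels.
GREEN_BLOCK = [["G", "G"],
               ["G", "G"]]
# Second pixel combination 2: red and blue pixels cater-cornered
# (which diagonal holds red is an assumption for this sketch).
RED_BLUE_BLOCK = [["R", "B"],
                  ["B", "R"]]

def filter_layout(blocks_x, blocks_y):
    """Tile the two combinations alternately in both directions and
    return a (2*blocks_y) x (2*blocks_x) grid of filter colors."""
    rows = []
    for by in range(blocks_y):
        for sub in range(2):  # each block spans two pixel rows
            row = []
            for bx in range(blocks_x):
                block = GREEN_BLOCK if (bx + by) % 2 == 0 else RED_BLUE_BLOCK
                row.extend(block[sub])
            rows.append(row)
    return rows

for row in filter_layout(4, 2):
    print(" ".join(row))
```

As in the text, half of all pixels end up green, and every green pixel sits inside an all-green 2x2 block with no color-filter border inside the block.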
(25) The layout shown in
(27) As shown in
(28) In addition, one on-chip lens (OCL.sub.B or OCL.sub.R) is formed with respect to each pixel of the second pixel combination 2 as the on-chip lens. The diameter of one on-chip lens (OCL.sub.B or OCL.sub.R) for each pixel of the second pixel combination 2 is different from the diameter of one on-chip lens OCL.sub.G for the first pixel combination 1.
(29) Alternatively, one on-chip lens may be formed for each pixel of the first pixel combination 1 in the same manner as in the second pixel combination 2.
(30) [Layer Configuration of Solid-State Imaging Device]
(33) In the solid-state imaging device according to this embodiment, separation insulating films 11 for separating the pixels are formed on the semiconductor substrate 10, and semiconductor regions 12 including photodiodes are formed in the area separated by the separation insulating films 11. As described above, the pixels are arranged and formed in the matrix shape on the light sensing surface of the semiconductor substrate.
(34) In addition, a signal reading unit (not shown) for reading the signal charge generated and accumulated in the photodiodes, and a voltage in accordance with the signal charge, is formed on the semiconductor substrate. In a CMOS image sensor, this unit is constituted by MOS transistors known as a floating diffusion and a source follower; in a CCD image sensor, it is constituted by a CCD channel.
(35) A first insulating film 13 is formed on the semiconductor substrate so as to cover the photodiodes. Conductive layers 14 which constitute the upper layer wiring are embedded in the first insulating film 13 above the region where the separation insulating films 11 are formed.
(36) In addition, openings 15 are formed in the first insulating film 13 above the photodiodes. Second insulating film 16 is formed so as to fill in the openings 15.
(37) The second insulating film 16 is made of a material with a higher refractive index than that of the first insulating film 13. For example, the first insulating film 13 is made of silicon oxide, silicon nitride, silicon carbide, or a laminated body thereof. The second insulating film 16 is made of a resin with a high refractive index such as siloxane resin or polyimide, and is preferably made of siloxane resin.
(38) In addition, the above-described resin contains metal oxide particles such as titanium oxide, tantalum oxide, niobium oxide, tungsten oxide, zirconium oxide, zinc oxide, indium oxide, or hafnium oxide, which increase the refractive index.
(39) With the above configuration, in which the insulating film with a high refractive index (the second insulating film 16) is embedded in the openings 15 of the insulating film with a low refractive index (the first insulating film 13), an optical waveguide for guiding the incident light from above is formed.
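The light-guiding effect of this high-index-core, low-index-cladding structure follows the standard total-internal-reflection condition. The refractive-index values below are typical textbook figures for silicon oxide and for particle-loaded siloxane resin, assumed for illustration rather than taken from the patent:

```python
import math

# Assumed illustrative indices (not specified in the patent text):
n_clad = 1.46  # low-index cladding, e.g. silicon oxide (first insulating film 13)
n_core = 1.80  # high-index core, e.g. siloxane resin with titanium-oxide
               # particles (second insulating film 16)

# Critical angle, measured from the normal to the waveguide sidewall:
# rays striking the wall at a larger angle are totally internally
# reflected and stay confined inside the waveguide.
critical_angle = math.degrees(math.asin(n_clad / n_core))
print(f"critical angle: {critical_angle:.1f} deg from the sidewall normal")
```

A larger index contrast lowers the critical angle, so more of the incident light is confined, which is why the high-index filler particles of paragraph (38) help the light collection.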
(40) The color filters (17B, 17G, 17R) are formed on the insulating films (12, 16), and the on-chip lens 18 is formed on the color filters (17B, 17G, 17R). As shown in
(41) Here, as shown in
(42) For example, with regard to the optical waveguides, one optical waveguide is formed for each pixel of the second pixel combination 2, and one opening size of the optical waveguide for each pixel of the second pixel combination 2 is different from that of the optical waveguide for the first pixel combination 1.
(43) On the other hand,
(44) This case is different from the one shown in
(45) [Configuration of Color Filters]
(48) The color filters of the green pixel G1 and the blue pixel B2 overlap each other, the color filters of the green pixel G1 and the red pixel R1 overlap each other, and these overlapping portions become non-effective regions. Here, which filter lies above the other in an overlapping portion depends on the order in which the color filters are formed. The color filters are formed such that parts thereof overlap each other at the border areas of the pixels, which results in the above-described configuration.
(49) On the other hand,
(50) As described above, the color filter according to this embodiment includes the first pixel combination 1 in which four green pixels are adjacently arranged. Therefore, there is no overlapping portion of the color filters in the first pixel combination 1, and thereby it is possible to reduce the area of the non-effective region and enhance sensitivity.
(51) Particularly, the green light is a light in a region for which people's eyes have high sensitivity. Therefore, it is possible to enhance luminance data of the obtained image data by improving the sensitivity of the green pixels.
(52) As described above, one optical waveguide is formed for each first pixel combination 1 as the optical waveguide of the green pixels. With this configuration, it is possible to secure large opening sizes of the optical waveguides, and enhance the light collection effect.
(53) [Manufacturing Method of Solid-State Imaging Device]
(54) First, the photodiodes are formed separately for each of the pixels arranged in a matrix on the light sensing surface of the semiconductor substrate, together with the signal reading unit for reading the signal charge generated and accumulated in the photodiodes and a voltage in accordance with the signal charge. Next, the insulating film including the optical waveguides is formed on the semiconductor substrate so as to cover the photodiodes. Then, the color filters are formed on the insulating film, and the on-chip lenses are formed on the color filters.
(55) Here, in the step of forming the color filters, a layout having the first pixel combinations 1 and the second pixel combinations 2 is employed. In the first pixel combination 1, two green pixels are arranged both in the horizontal and vertical directions, so that a total of four green pixels, each provided with a green color filter, are arranged. In the second pixel combination 2, two pixels are arranged both in the horizontal and vertical directions, so that a total of four pixels are arranged, with two red pixels having red color filters and two blue pixels having blue color filters arranged cater-cornered. The first pixel combinations 1 and the second pixel combinations 2 are alternately arranged both in the horizontal and vertical directions.
(56) As described above, four adjacent green pixels constitute the first pixel combination 1, and no overlapping portion comes about in the first pixel combination 1. Accordingly, it is possible to manufacture the solid-state imaging device capable of reducing the area of the non-effective regions and enhancing sensitivity.
(57) Particularly, it is possible to secure the large opening sizes of the optical waveguides and thereby to enhance the light collection effect by forming one optical waveguide for each first pixel combination 1 as the optical waveguide for the green pixels.
First Modified Example
(58) [Layout of Color Filters and On-Chip Lenses]
(60) That is, the pixel layout includes the first pixel combinations 1 and the second pixel combinations 2. In the first pixel combination 1, two green pixels (G) are arranged both in the horizontal and vertical directions, and a total of four green pixels, each provided with a green color filter, are arranged. In the second pixel combination 2, two pixels are arranged both in the horizontal and vertical directions, a total of four pixels are arranged, and two red pixels (R) having the red color filters and two blue pixels (B) having the blue color filters are arranged cater-cornered. These first pixel combinations 1 and second pixel combinations 2 are alternately arranged in both the horizontal and vertical directions in the layout.
(61) The layout shown in
(63) One on-chip lens OCL.sub.G is formed for the first pixel combination 1 as the above-described on-chip lens as shown in
(64) In addition, one on-chip lens (OCL.sub.B or OCL.sub.R) is formed for each pixel of the second pixel combination 2 as the on-chip lens. The diameter of one on-chip lens (OCL.sub.B or OCL.sub.R) for each pixel of the second pixel combination 2 is different from that of one on-chip lens OCL.sub.G for the first pixel combination 1.
(65) In addition, as the optical waveguide for the green pixel (G), one optical waveguide is formed for each first pixel combination 1.
(66) On the other hand, as the optical waveguide, one optical waveguide is formed for each pixel of the second pixel combination 2, and one opening size of the optical waveguide for each pixel of the second pixel combination 2 is different from that of the optical waveguide for the first pixel combination 1.
(67) As described above, according to the color filter of this modified example, four adjacent green pixels constitute the first pixel combination 1, and no overlapping portion comes about in the first pixel combination 1. Accordingly, it is possible to reduce the area of the non-effective regions and enhance sensitivity.
(68) Particularly, the green light is a light in a region for which people's eyes have high sensitivity. Therefore, it is possible to enhance luminance data of the obtained image data by improving the sensitivity of the green pixels.
(69) In addition, one optical waveguide is formed for each first pixel combination 1 as the optical waveguide of the green pixels. With this configuration, it is possible to secure large opening sizes of the optical waveguides, and enhance the light collection effect.
Second Modified Example
(70) [Layout of Color Filters and On-Chip Lenses]
(72) The solid-state imaging device according to this modified example is substantially the same as that of the first embodiment except that the positions of the red pixels (R) having the red color filters are switched with the positions of the blue pixels (B) having the blue color filters.
(73) That is, the pixel layout includes the first pixel combinations 1 and the second pixel combinations 2. In the first pixel combination 1, two green pixels (G) are arranged both in the horizontal and vertical directions, and a total of four green pixels, each provided with a green color filter, are arranged. In the second pixel combination 2, two pixels are arranged both in the horizontal and vertical directions, a total of four pixels are arranged, and two red pixels (R) having the red color filters and two blue pixels (B) having the blue color filters are arranged cater-cornered. These first pixel combinations 1 and second pixel combinations 2 are alternately arranged in both the horizontal and vertical directions in the layout.
(74) The portions where the positions of the red pixels (R) are switched with the positions of the blue pixels (B) correspond to part of the layout according to the first embodiment. As shown in
(75) The layout shown in
(77) One on-chip lens OCL.sub.G is formed for the first pixel combination 1 as the above-described on-chip lens as shown in
(78) In addition, one on-chip lens (OCL.sub.B or OCL.sub.R) is formed for each pixel of the second pixel combination 2 as the on-chip lens. The diameter of one on-chip lens (OCL.sub.B or OCL.sub.R) for each pixel of the second pixel combination 2 is different from that of one on-chip lens OCL.sub.G for the first pixel combination 1.
(79) In addition, as the optical waveguide for the green pixel (G), one optical waveguide is formed for each first pixel combination 1.
(80) On the other hand, as the optical waveguide, one optical waveguide is formed for each pixel of the second pixel combination 2, and one opening size of the optical waveguide for each pixel of the second pixel combination 2 is different from that of the optical waveguide for the first pixel combination 1.
(81) As described above, according to the color filter of this modified example, four adjacent green pixels constitute the first pixel combination 1, and no overlapping portion comes about in the first pixel combination 1. Accordingly, it is possible to reduce the area of the non-effective regions and enhance sensitivity.
(82) Particularly, the green light is a light in a region for which people's eyes have high sensitivity. Therefore, it is possible to enhance luminance data of the obtained image data by improving the sensitivity of the green pixels.
(83) In addition, one optical waveguide is formed for each first pixel combination 1 as the optical waveguide of the green pixels. With this configuration, it is possible to secure large opening sizes of the optical waveguides, and enhance the light collection effect.
Second Embodiment
(84) [Layout of Color Filters and On-Chip Lenses]
(86) The solid-state imaging device according to this embodiment is substantially the same as that of the first embodiment except for the configurations of the color filters, the on-chip lenses, and the optical waveguides.
(87) The layout of the pixels provided with the color filters includes the first pixel combinations 1, the second pixel combinations 2, and the third pixel combinations 3. In the first pixel combination 1, two green pixels are arranged both in the horizontal and vertical directions, and a total of four pixels, each provided with a green color filter, are arranged. In the second pixel combination 2, two blue pixels are arranged both in the horizontal and vertical directions, and a total of four pixels, each provided with a blue color filter, are arranged. In the third pixel combination 3, two red pixels are arranged both in the horizontal and vertical directions, and a total of four pixels, each provided with a red color filter, are arranged. Here, a fourth pixel combination 4 is configured such that the first pixel combinations 1 and the second pixel combinations 2 are alternately arranged in the horizontal direction. In addition, a fifth pixel combination 5 is configured such that the first pixel combinations 1 and the third pixel combinations 3 are alternately arranged in the horizontal direction. In this layout, the fourth pixel combinations 4 and the fifth pixel combinations 5 are alternately arranged in the vertical direction, shifted in the horizontal direction by the width of one first pixel combination 1.
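The block-level tiling described above can be sketched as follows; the function names, and the choice of which off-diagonal holds blue versus red, are assumptions for illustration:

```python
def block_color(bx, by):
    """Color of the 2x2 block at block coordinates (bx, by):
    green blocks (combination 1) on the diagonal lattice, with
    blue blocks (combination 2) in even block rows (combination 4)
    and red blocks (combination 3) in odd block rows (combination 5)."""
    if (bx + by) % 2 == 0:
        return "G"
    return "B" if by % 2 == 0 else "R"

def quad_layout(blocks_x, blocks_y):
    """Expand the block pattern into per-pixel rows of filter colors."""
    rows = []
    for by in range(blocks_y):
        for _ in range(2):  # each block spans two pixel rows
            rows.append([c for bx in range(blocks_x)
                           for c in (block_color(bx, by),) * 2])
    return rows

for row in quad_layout(4, 2):
    print(" ".join(row))
```

The shift by one block per row of combinations places the green blocks diagonally, so the block-level pattern mirrors an ordinary Bayer arrangement at four times the pitch.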
(88) The layout shown in
(90) One on-chip lens OCL.sub.G is formed for the first pixel combination 1 as the on-chip lens as shown in
(91) In addition, one on-chip lens (OCL.sub.B or OCL.sub.R) is formed for each of the second pixel combination 2 and the third pixel combination 3 as the on-chip lens. The diameter of one on-chip lens (OCL.sub.B or OCL.sub.R) for each of the second pixel combination 2 and the third pixel combination 3 is equal to that of one on-chip lens OCL.sub.G for the first pixel combination 1.
(92) In addition, as the optical waveguide for the green pixel (G), one optical waveguide is formed for each first pixel combination 1.
(93) As the optical waveguides, one optical waveguide is formed for each of the second pixel combination 2 and the third pixel combination 3, and one opening size of the optical waveguide for each of the second pixel combination 2 and the third pixel combination 3 is equal to that of the optical waveguide for the first pixel combination 1.
(94) As described above, according to the color filter of this embodiment, four adjacent green pixels constitute the first pixel combination 1, and no overlapping portion of the color filters comes about in the first pixel combination 1. Accordingly, it is possible to reduce the area of the non-effective regions and enhance sensitivity. In addition, four adjacent blue pixels and four adjacent red pixels respectively constitute the second pixel combination 2 and the third pixel combination 3 in the same manner, and no overlapping portion of the color filters comes about in the second pixel combination 2 and the third pixel combination 3. Accordingly, it is possible to reduce the area of the non-effective regions and enhance sensitivity.
(95) Particularly, the green light is a light in a region for which people's eyes have high sensitivity. Therefore, it is possible to enhance luminance data of the obtained image data by improving the sensitivity of the green pixels.
(96) In addition, one optical waveguide is formed for each of the first pixel combination 1, the second pixel combination 2, and the third pixel combination 3, as the optical waveguides of the green pixels, blue pixels and the red pixels. With this configuration, it is possible to secure large opening sizes of the optical waveguides, and enhance the light collection effect.
(97) [Manufacturing Method of Solid-State Imaging Device]
(98) First, the photodiodes are formed separately for each of the pixels arranged in a matrix on the light sensing surface of the semiconductor substrate, together with the signal reading unit for reading the signal charge generated and accumulated in the photodiodes and a voltage in accordance with the signal charge. Next, the insulating film including the optical waveguides is formed on the semiconductor substrate so as to cover the photodiodes. Then, the color filters are formed on the insulating film, and the on-chip lenses are formed on the color filters.
(99) Here, in the step of forming the color filters, a layout including the first, second, and third pixel combinations 1, 2, and 3 is employed. In the first pixel combination 1, two green pixels are arranged both in the horizontal and vertical directions, and a total of four pixels, each provided with a green color filter, are arranged. In the second pixel combination 2, two blue pixels are arranged both in the horizontal and vertical directions, and a total of four pixels, each provided with a blue color filter, are arranged. In the third pixel combination 3, two red pixels are arranged both in the horizontal and vertical directions, and a total of four pixels, each provided with a red color filter, are arranged. The fourth pixel combinations 4, in which the first pixel combinations 1 and the second pixel combinations 2 are alternately arranged in the horizontal direction, and the fifth pixel combinations 5, in which the first pixel combinations 1 and the third pixel combinations 3 are alternately arranged in the horizontal direction, are alternately arranged in the vertical direction, shifted in the horizontal direction by the width of one first pixel combination 1.
(100) In the step of forming the on-chip lenses, one on-chip lens is formed for each of the first, second, and third pixel combinations 1, 2, and 3 as the on-chip lenses.
(101) As described above, four adjacent green pixels constitute the first pixel combination 1, and no overlapping portion comes about in the first pixel combination 1. Accordingly, it is possible to manufacture the solid-state imaging device capable of reducing the area of the non-effective regions and enhancing sensitivity. Particularly, since the adjacent blue pixels and the adjacent red pixels respectively form the second pixel combination 2 and the third pixel combination 3, no overlapping portion comes about in the second pixel combination 2 and the third pixel combination 3. Accordingly, it is possible to reduce the area of the non-effective regions and enhance sensitivity.
(102) In addition, it is possible to secure the large opening sizes of the optical waveguides and thereby to enhance the light collection effect by forming one optical waveguide for each first pixel combination 1 as the optical waveguide for the green pixels. Moreover, it is possible to secure the large opening sizes of the optical waveguides and thereby to enhance the light collection effect by forming one optical waveguide for each four blue pixels and each four red pixels in the same manner.
Third Embodiment
(103) [Configuration in which Floating Diffusion is Shared]
(105) The solid-state imaging device according to this embodiment is a CMOS image sensor.
(106) The pixel layout and the on-chip lens layout are the same as those in the first embodiment. These may be the same as those in the first modified example, the second modified example, or the second embodiment.
(107) The CMOS image sensor is provided with a MOS transistor which is called a floating diffusion and a source follower for each pixel, which will be described later. Here, in the configuration according to this embodiment, the circuits after the floating diffusion are shared by a plurality of pixels.
(108) As shown in
(110) For example, a photodiode 33 is formed in a pixel 31, and connected to the floating diffusion FD through a transfer transistor 35.
(111) In the same manner, a photodiode 34 is formed in the other pixel 32, and connected to the floating diffusion FD through the transfer transistor 36.
(112) Control lines (42, 43) connected to the gates of the transfer transistors (35, 36) control the ON and OFF states of the respective transfer transistors (35, 36), and the signal charge accumulated in the photodiodes (33, 34) is transferred to the floating diffusion FD.
(113) A gate electrode of an amplification transistor 38, which is called a source follower, is connected to the floating diffusion FD, and the output in accordance with the amount of the electric charge of the floating diffusion FD is output from an output line 40.
(114) In addition, a reset transistor 37 is connected to the floating diffusion FD, and it is possible to remove and reset the signal charge from the floating diffusion after completing the reading by turning on and off the reset transistor 37.
(115) In the above operations, by sequentially executing the transferring by the transfer transistor, the reading of the signal, and the resetting operation for each pixel, it is possible to obtain the signal of each pixel even if the floating diffusion is shared by a plurality of pixels.
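The transfer-read-reset sequence described above can be sketched as follows. The class and method names are illustrative (not from the patent), and the charge-to-voltage conversion of the source follower is idealized to an identity:

```python
class SharedFDGroup:
    """Idealized model of several photodiodes sharing one floating diffusion."""
    def __init__(self, charges):
        self.charges = list(charges)  # charge accumulated in each photodiode
        self.fd = 0                   # the single shared floating diffusion node

    def read_all(self):
        signals = []
        for i in range(len(self.charges)):
            # Transfer transistor i on: move this pixel's charge to the FD.
            self.fd += self.charges[i]
            self.charges[i] = 0
            # Source follower: the output tracks the FD charge.
            signals.append(self.fd)
            # Reset transistor on: clear the FD before the next pixel.
            self.fd = 0
        return signals

print(SharedFDGroup([10, 20, 30, 40]).read_all())
```

Because the FD is reset between transfers, each pixel's signal is recovered individually even though all four pixels drive the same FD, amplification transistor, and reset transistor.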
(116) [Layout of Photodiodes in Accordance with Sizes of On-Chip Lenses]
(118) The photodiodes (PD.sub.G, PD.sub.B, PD.sub.R) are connected to the floating diffusion FD through the transfer transistors each having a transfer gate (G.sub.G, G.sub.B, or G.sub.R). One floating diffusion is shared by four pixels.
(119) In the above configuration, the on-chip lens (OCL.sub.B or OCL.sub.R) is formed for each pixel of the blue pixel B and the red pixel R. On the other hand, one on-chip lens OCL.sub.G shared by a pixel combination constituted by four green pixels G is formed for the green pixels G. The on-chip lens has a light collection efficiency which is higher in the center portion thereof and becomes lower in the periphery portion thereof.
(120) It is preferable that the photodiodes (PD.sub.B, PD.sub.R) for the blue pixel B and the red pixel R are respectively provided at the centers of the blue pixel B and the red pixel R.
(121) On the other hand, the photodiodes PD.sub.G for the green pixels G are preferably provided not at the centers of the pixels but at positions closer to the centers of the on-chip lenses OCL.sub.G, as compared with the positions of the photodiodes (PD.sub.B, PD.sub.R) for the blue pixel B and the red pixel R, as shown in
(122) When the positions of the photodiodes differ from pixel to pixel as described above, it is preferable to employ, in the layout, a floating diffusion FD with a form extending toward the photodiodes PD.sub.G for the green pixels G as shown in
(123) Other components are the same as those in the first embodiment. Alternatively, these may be the same as those in the first modified example, the second modified example, or the second embodiment.
(124) As described above, according to the color filters of this embodiment, four adjacent green pixels constitute the first pixel combination 1, and no overlapping portion comes about within the first pixel combination 1. Accordingly, it is possible to reduce the area of the non-effective regions and enhance sensitivity. In particular, green light lies in a wavelength region to which the human eye is highly sensitive. Therefore, it is possible to enhance the luminance data of the obtained image data by improving the sensitivity of the green pixels.
(125) In addition, it is possible to secure large opening sizes of the optical waveguides, and thereby enhance the light collection effect, by forming one optical waveguide for each first pixel combination 1 as the optical waveguide for the green pixels.
Fourth Embodiment
(126) [Configuration of Upper Layer Wiring]
(127)
(128) The layout of the pixels and the on-chip lenses is the same as that in the first embodiment. Alternatively, it may be the same as that in the first modified example, the second modified example, or the second embodiment. In addition, when the solid-state imaging device is a CMOS image sensor, the configuration thereof may be the same as that in the third embodiment.
(129) As shown in
(130) The above conductive layer 14 is made of metal, which does not transmit light, or of polysilicon, which has low light permeability. In the related art, the conductive layers were provided in the border portions between pixels, and therefore did not disturb the light incident on the photodiodes of the pixels. According to this embodiment, however, four adjacent green pixels constitute the first pixel combination 1, so even positions corresponding to the border portions of the pixels cross over the pixels within the first pixel combination 1, and there is a possibility that conductive layers provided at such positions disturb the incident light.
(131) Thus, according to this embodiment, the conductive layer W as the upper layer wiring is positioned so as not to cross over the pixels in each first pixel combination 1 as shown in
(132) With the above configuration, it is possible to reduce the possibility that the conductive layers W as the upper layer wirings disturb the light incident on the photodiodes.
(133) Other components may be the same as those in each embodiment or each modified example.
Fifth Embodiment
(134) [Configuration in which Pluralities of Green Pixels are Arranged in First and Second Directions]
(135)
(136) The green pixels G, the blue pixels B, and the red pixels R are respectively arranged as shown in
(137) Here, as the color filters, a plurality of green pixels G having green color filters are arranged in both the first and second directions. The blue pixels B and the red pixels R are entirely surrounded by the green pixels G.
(138) In addition, the on-chip lens shared by a plurality of green pixels is formed as the on-chip lens. The number of pixels may be any number as long as it is more than one. Moreover, the on-chip lens may be separated at appropriate positions.
(139) In addition, the optical waveguide for the green pixels may be formed so as to communicate with a plurality of pixels.
(140) According to the solid-state imaging device with the above-described layout, it is possible to enlarge the size of each pixel without degrading the quality of the obtained image data, and to thereby enhance sensitivity.
First Working Example
(141) The solid-state imaging device including the pixels arranged in the layout shown in
(142) In the layout shown in
(143) In the above configuration, the blue signals and the red signals in the green pixels G35, G36, G45, and G46 can be obtained by calculating from the data of the blue signals and the red signals of the pixels, which exist in the periphery of each pixel, as follows:
R36=(R16+R56)/2, R46=(R44+R48)/2
B36=(B34+B38)/2, B46=(B26+B66)/2
R35=(R33+R37)/2, R45=(R25+R65)/2
B35=(B15+B55)/2, B45=(B43+B47)/2[Equation 1]
(144) The green signals in the red pixels R33 and R44 and the blue pixels B34 and B43 can be obtained by calculating from the data of the green signals of the pixels, which exist in the periphery of each pixel.
G34=(G14+G54+G32+G36)/4, G44=(G24+G64+G42+G46)/4
G33=(G13+G53+G31+G35)/4, G43=(G23+G63+G41+G45)/4[Equation 2]
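The averaging in Equation 1 and Equation 2 can be sketched as a short interpolation routine. This is an illustrative sketch only: the function names are hypothetical, and the raw signals are assumed to be stored in a dictionary keyed by (row, column) position, with the pixel labels of the text (e.g. R16 is the pixel at row 1, column 6).

```python
# Sketch of the neighbour-averaging interpolation of Equations 1 and 2
# (assumed data layout: dict mapping (row, col) -> raw pixel signal).

def average(signals, positions):
    """Average the raw signals at the given (row, col) positions."""
    return sum(signals[p] for p in positions) / len(positions)

def interpolate_first_example(signals):
    """Compute missing colour signals at selected positions (illustrative)."""
    out = {}
    # Red/blue at green-pixel positions: average of the two nearest
    # same-colour pixels (Equation 1), e.g. R36 = (R16 + R56) / 2.
    out['R36'] = average(signals, [(1, 6), (5, 6)])
    out['B36'] = average(signals, [(3, 4), (3, 8)])
    # Green at red/blue-pixel positions: average of the four nearest
    # green pixels (Equation 2), e.g. G34 = (G14 + G54 + G32 + G36) / 4.
    out['G34'] = average(signals, [(1, 4), (5, 4), (3, 2), (3, 6)])
    out['G33'] = average(signals, [(1, 3), (5, 3), (3, 1), (3, 5)])
    return out
```

The same two-neighbour and four-neighbour averaging pattern, with different indices, covers the remaining signals of Equations 1 and 2 and the second and fourth working examples.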
Second Working Example
(145) The solid-state imaging device including the pixels arranged in the layout shown in
(146) In the layout shown in
(147) In the above configuration, the blue signals and the red signals in the green pixels G35, G36, G45, and G46 can be obtained by calculating from the data of the blue signals and the red signals of the pixels, which exist in the periphery of each pixel, as follows:
R36=(R34+R38)/2, R46=(R26+R66)/2
B36=(B16+B56)/2, B46=(B44+B48)/2
R35=(R15+R55)/2, R45=(R43+R47)/2
B35=(B33+B37)/2, B45=(B25+B65)/2[Equation 3]
(148) In addition, the green signals in the blue pixels B33 and B44 and the red pixels R34 and R43 can be obtained by calculating from the data of the green signals of the pixels, which exist in the periphery of each pixel.
G34=(G14+G54+G32+G36)/4, G44=(G24+G64+G42+G46)/4
G33=(G13+G53+G31+G35)/4, G43=(G23+G63+G41+G45)/4[Equation 4]
Third Working Example
(149) The solid-state imaging device including the pixels arranged in the layout shown in
(150) In the layout shown in
(151) In the above configuration, the blue signals and the red signals in the green pixels G35, G36, G45, and G46 can be obtained by calculating from the data of the blue signals and the red signals of the pixels, which exist in the periphery of each pixel, as follows:
B36=(B25+B47)/2, R46=(R37+R55)/2
R35=(R26+R44)/2, B45=(B34+B56)/2[Equation 5]
(152) In addition, the green signals in the red pixels R33 and R44 and the blue pixels B34 and B43 can be obtained by calculating from the data of the green signals of the pixels, which exist in the periphery of each pixel.
G34=(G14+G54+G32+G36)/4, G44=(G24+G64+G42+G46)/4
G33=(G13+G53+G31+G35)/4, G43=(G23+G63+G41+G45)/4[Equation 6]
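Unlike the first working example, Equation 5 averages diagonally adjacent same-colour pixels. A minimal sketch of that step, with illustrative (row, col) indices taken from the labels in the text:

```python
# Sketch of the diagonal averaging of Equation 5 (third working example);
# the helper name is hypothetical, and signals is assumed to map
# (row, col) -> raw pixel signal.

def diag_avg(signals, p1, p2):
    """Average two diagonally adjacent same-colour signals, per Equation 5."""
    return (signals[p1] + signals[p2]) / 2

# For example: B36 = (B25 + B47) / 2 and R46 = (R37 + R55) / 2.
```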
Fourth Working Example
(153) The solid-state imaging device including the pixels arranged in the layout shown in
(154) In the layout shown in
(155) In the above configuration, the blue signals and the red signals in the green pixels G35, G36, G45, and G46 can be obtained by calculating from the data of the blue signals and the red signals of the pixels, which exist in the periphery of each pixel, as follows:
R36=(R16+R56)/2, R46=(R44+R48)/2
B36=(B34+B38)/2, B46=(B26+B66)/2
R35=(R33+R37)/2, R45=(R25+R65)/2
B35=(B15+B55)/2, B45=(B43+B47)/2[Equation 7]
(156) In addition, the green signals in the blue pixels B33, B34, B43 and B44 can be obtained by calculating from the data of the green signals of the pixels, which exist in the periphery of each pixel.
G34=(G14+G54+G32+G36)/4, G44=(G24+G64+G42+G46)/4
G33=(G13+G53+G31+G35)/4, G43=(G23+G63+G41+G45)/4[Equation 8]
Sixth Embodiment
(157) [Camera Employing Solid-State Imaging Device]
(158)
(159) This camera is provided with a solid-state imaging device 50 in which a plurality of pixels are integrated, an optical system 51, and a signal processing circuit 53.
(160) In this embodiment, the above solid-state imaging device 50 includes the solid-state imaging device according to any one of the first to fifth embodiments and the first and second modified examples.
(161) The optical system 51 forms an image of the imaging light (incident light) from the object on the imaging surface of the solid-state imaging device 50. As a result, the photodiode constituting each pixel on the imaging surface of the solid-state imaging device 50 converts the imaging light into signal charge in accordance with the incident light amount, and accumulates the signal charge for a predetermined time period.
(162) The accumulated signal charge is output as an output signal Vout through a CCD charge channel, for example.
(163) The signal processing circuit 53 subjects the output signal Vout of the solid-state imaging device 50 to various kinds of signal processing, and outputs the result as a video signal.
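The camera signal flow described above, from the optical system through the imaging device to the signal processing circuit, can be sketched as a simple pipeline. This is an illustrative sketch only; the function name and the stage interfaces are assumptions, not from the patent.

```python
# Hypothetical sketch of the camera pipeline of this embodiment:
# optical system 51 -> solid-state imaging device 50 -> signal processing 53.

def capture_frame(scene, optics, imaging_device, signal_processor):
    """Run one frame through the three stages of the camera."""
    incident_light = optics(scene)         # image formed on the imaging surface
    vout = imaging_device(incident_light)  # charge accumulated, read out as Vout
    return signal_processor(vout)          # signal processing -> video signal
```

For example, each stage can be supplied as a callable (an attenuation model for the optics, a linear conversion for the sensor, and a summation for the processing) so the overall flow can be exercised end to end.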
(164) According to the above-described camera of this embodiment, it is possible to improve the color shading characteristic and the dispersion characteristic without lowering the light collection ratio of the obliquely incident light and enhance sensitivity. Moreover, it is possible to form the microlens with a simple method and simple processes.
(165) The embodiments of the present invention are not limited to the above description.
(166) For example, the embodiments can be applied to both the CMOS sensor and the CCD element.
(167) In addition, various modifications can be made without departing from the scope of the present invention.
(168) The present application contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2009-173127 filed in the Japan Patent Office on Jul. 24, 2009, the entire content of which is hereby incorporated by reference.
(169) It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.