Gaze tracking apparatus

10437329 · 2019-10-08

Assignee

Inventors

CPC classification

International classification

Abstract

Provided are gaze tracking apparatuses, which in some embodiments can include an optoelectronic device, wherein the optoelectronic device includes an image sensor with a non-local readout circuit having a substrate and a plurality of pixels and operatively connected to a control unit, wherein a first area of the substrate is at least partially transparent to visible light and at least the plurality of pixels of the image sensor are arranged on the first area of the substrate to aim at an eye of a user when placed in front of an inner face of the substrate, and wherein the control unit is also adapted to control the image sensor to acquire image information from the user's eye for performing a gaze tracking of the user's eye.

Claims

1. A gaze tracking apparatus, comprising an optoelectronic device, wherein the optoelectronic device comprises: a substrate having a first area that is at least partially transparent to visible light, a plurality of photodetectors arranged on said first area of the substrate to aim to an eye of a user when placed in front of an inner face of said substrate, and a control unit operatively connected to the plurality of photodetectors to at least receive output signals supplied from each of the photodetectors when light impinges thereon, wherein the control unit is also adapted to perform a gaze tracking of said eye based on the output signals from the photodetectors, and further wherein the control unit comprises: a first biasing circuit for providing a first biasing voltage; a second biasing circuit for providing a second biasing voltage, the second biasing voltage being substantially symmetrical to the first biasing voltage with respect to a voltage reference; and a non-local readout circuit for reading out a photo-signal generated by light impinging on a plurality of pixels; wherein the first biasing circuit and the second biasing circuit comprise, respectively, first selection means and second selection means to selectively bias one or more pixels of said plurality of pixels that are to be read out at a given time, the first selection means and the second selection means being arranged outside the first area of the substrate; and wherein each pixel of the plurality of pixels comprises: a photo-active element comprising a photosensitizing layer associated to a transport layer, the transport layer including at least one layer of a two-dimensional material; a non-photo-active reference element disposed proximate to the photo-active element, the non-photo-active reference element having a dark conductance that substantially matches a dark conductance of the photo-active element; a first contact circuitally connected to the first biasing circuit; a second contact 
circuitally connected to the second biasing circuit; and an output contact circuitally connected to the non-local readout circuit; wherein the photo-active element is circuitally connected between the first contact and the output contact, and the non-photoactive reference element is circuitally connected between the output contact and the second contact; wherein the optoelectronic device comprises an image sensor with the non-local readout circuit, wherein said image sensor comprises said substrate and the plurality of pixels arranged on the first area of the substrate, said plurality of pixels comprising said plurality of photodetectors, the plurality of photodetectors comprising the photo-active elements and having a built-in photoconductive gain; and wherein the control unit is adapted to selectively bias said plurality of pixels and read them out by means of the non-local readout circuit comprised by the control unit, said non-local readout circuit being arranged outside the first area of the substrate, and wherein the control unit is also adapted to control the image sensor to acquire image information from said eye for performing said gaze tracking of said eye.

2. The gaze tracking apparatus of claim 1, wherein said control unit comprises a processing unit and associated electric and electronic circuitry, including readout electronics, and that is operatively connected to or includes said non-local readout circuit, for receiving and processing said acquired image information to perform said gaze-tracking of said eye.

3. The gaze tracking apparatus of claim 2, wherein at least part of said control unit, including said readout electronics, is arranged on an area of the substrate that is outside the first area of the substrate and/or on an area of another substrate.

4. The gaze tracking apparatus of claim 3, wherein said area of the substrate or of said another substrate where said at least part of the control unit is arranged, is an area that is non-transparent to visible light.

5. The gaze tracking apparatus of claim 3, wherein the optoelectronic device is a wearable device, wherein said substrate is, or comprises, or is attached to, or embedded in, an eyeglass, lens, or visor of said wearable device that stands in front of the user's eye when the user wears the wearable device, and wherein said at least part of the control unit is arranged out of said eyeglass, lens, or visor.

6. The gaze tracking apparatus of claim 5, wherein said wearable device is one of an eyeglasses and a goggles, comprising at least said eyeglass or lens.

7. The gaze tracking apparatus of claim 6, wherein said at least part of the control unit arranged out of the eyeglass or lens is arranged in a frame of the eyeglasses.

8. The gaze tracking apparatus of claim 5, wherein said wearable device is a helmet comprising said visor.

9. The gaze tracking apparatus of claim 3, wherein said substrate is, or comprises, or is attached to, or embedded in, a panel that is at least partially transparent to visible light.

10. The gaze tracking apparatus of claim 9, wherein said panel is at least one of a window and a screen.

11. The gaze tracking apparatus of claim 10, further comprising a computing device that includes said screen.

12. The gaze tracking apparatus of claim 1, wherein the optoelectronic device is made and arranged to position, in use, the first area of the substrate in front of said eye of the user when placed in front of said inner face of the substrate, so that the user can see through the first area of the substrate.

13. The gaze tracking apparatus of claim 1, made and arranged to operate under passive illumination, wherein the image sensor is made and arranged to acquire said image information from a portion of ambient light that is reflected off of the user's eye.

14. The gaze tracking apparatus of claim 1, further comprising an active illumination unit comprising at least one light source that is operatively connected to said control unit, and made and arranged to emit light, in use, towards said user's eye under the control of said control unit, and wherein the image sensor is made and arranged to acquire said image information from light that is emitted from the at least one light source and is reflected off of the user's eye.

15. The gaze tracking apparatus of claim 14, wherein said at least one light source and said image sensor operate in an eye-safe short-wave infrared range.

16. The gaze tracking apparatus of claim 15, comprising a light filter for blocking ambient light, wherein said light filter is arranged on or attached to an outer face of the substrate, opposite to said inner face, or embedded in a surface between the plurality of photodetectors and said outer face of the substrate.

17. The gaze tracking apparatus of claim 1, further comprising lenses covering the photo-active elements of the image sensor, wherein said lenses are at least partially transparent to visible light.

18. The gaze tracking apparatus of claim 17, further comprising electrically conductive traces for electrically connecting at least the control unit and the image sensor, wherein said electrically conductive traces are arranged at least in part on the first area of the substrate, or of another substrate at least partially transparent to visible light and that is attached to or embedding said substrate.

19. The gaze tracking apparatus of claim 18, wherein said electrically conductive traces are at least partially transparent to visible light.

20. The gaze tracking apparatus of claim 18, wherein said electrically conductive traces are opaque to visible light, sufficiently thin and distributed through the first area of the substrate with such a separation between them to allow a user to see through the first area of the substrate.

Description

BRIEF DESCRIPTION OF THE FIGURES

(1) In the following, some preferred embodiments of the invention will be described with reference to the enclosed Figures. They are provided for illustration purposes only, without thereby limiting the scope of the invention.

(2) FIG. 1 is a schematic block diagram of an exemplary image sensor according to the present invention.

(3) FIGS. 2a and 2b correspond to a bottom plan view and a cross-sectional view of a pixel for the image sensor of FIG. 1, in which the first, second and output contacts of the pixel are disposed below the transport layer of the photo-active element and the transport layer of the reference element of the pixel.

(4) FIGS. 3a and 3b show, in a bottom plan view and a cross-sectional view, an alternative pixel layout for the image sensor of FIG. 1, in which the first, second and output contacts of the pixel are disposed above the transport layer of the photo-active element and the transport layer of the reference element of the pixel.

(5) FIG. 4 depicts a cross-sectional view of a pixel for an image sensor according to the present invention, in which the transport layer of the reference element of the pixel has a smaller area than the transport layer of the photo-active element of said pixel.

(6) FIG. 5 corresponds to a cross-sectional view of a pixel suitable for an image sensor in accordance with the present invention, in which the reference element of the pixel is arranged below the photo-active element of said pixel.

(7) FIGS. 6a and 6b are a bottom plan view and a cross-sectional view of a pixel for an image sensor according to the present invention, in which the pixel comprises a back-gate contact below the photo-active element.

(8) FIGS. 7a and 7b depict a bottom plan view and a cross-sectional view of another pixel for an image sensor according to the present invention, in which the pixel comprises a back-gate contact below each of the photo-active element and the reference element.

(9) FIG. 8a shows a schematic block diagram of an embodiment of an image sensor according to the present invention in which the readout circuit comprises a multiplexer followed by an amplifier and a storage element cascaded thereto.

(10) FIG. 8b is a schematic block diagram of another embodiment of an image sensor according to the present invention in which the readout circuit comprises as many amplifiers as there are pixels in each row of the array of pixels and a storage element connected in series to the output node of each amplifier.

(11) FIG. 9 is a detailed representation, in a cross-sectional view, of area A in FIG. 1, illustrating the crossing of different conductive traces.

(12) FIGS. 10a-10g depict the different steps in the process of fabrication of a pixel of the image sensor of FIG. 1.

(13) FIG. 11 is a schematic representation of an exemplary image sensor in which its pixels are grouped into clusters, each cluster being sensitive to a different range of the spectrum.

(14) FIG. 12 shows a block diagram of an optoelectronic device in accordance with an embodiment of the present invention.

(15) FIGS. 13a and 13b are, respectively, a side view and a plan view of the image sensor of the present invention for an embodiment for which a light-concentrating structure is arranged on top of the image sensor.

(16) FIGS. 14a and 14b are, respectively, a side view and a plan view of the image sensor of the present invention for an embodiment for which a micro-lens is arranged on top of each pixel.

(17) FIG. 15 is a plot showing the normalized spectral response of three different pixels of the image sensor of the present invention for an embodiment for which the image sensor is capable of multispectral response by the inclusion therein of pixels with photosensitizing layers sensitive to different ranges of the light spectrum, comprising quantum dots (QD) having different sizes (per pixel), where the curves relate to Short-wave Infrared (SWIR), Near Infrared (NIR) and visible light (VIS).

(18) FIG. 16 shows several curves representative of data obtained from a pixelated detector built according to the image sensor of the present invention where the light is transmitted through a diffractive optics system before it hits the pixelated detector. Each curve corresponds to data obtained when the combined system (diffractive optics coupled to the pixelated detector according to the present invention) is illuminated with light of a specific wavelength (corresponding to the wavelength where the maximum in each curve occurs).

(19) FIG. 17 shows several curves representative of data extracted from a 4-pixel photodetector linear array according to the image sensor of the present invention, arranged on a flexible and transparent substrate.

(20) FIG. 18 schematically shows a gaze tracking principle, according to a conventional gaze tracking apparatus and approach.

(21) FIG. 19 schematically shows part of the gaze tracking apparatus of the present invention (some elements have been omitted, such as the control unit, electrically conductive traces, etc.), for an embodiment, and the operation thereof to achieve an improved gaze tracking process.

(22) FIG. 20 schematically shows part of the gaze tracking apparatus of the present invention, for another embodiment, and the operation thereof, for a virtual reality application.

(23) FIG. 21 schematically shows part of the gaze tracking apparatus of the present invention, where the optoelectronic device is a wearable device, particularly an eyeglasses, for four embodiments, differing in that the first area of the substrate (i.e. that including the detector array or pixels of the image sensor) occupies part or all of the glass or glasses of the eyeglasses.

(24) FIG. 22 schematically shows part of the gaze tracking apparatus of the invention, for different embodiments for which the apparatus comprises a computing device, such as a computer monitor (left) or a mobile phone screen (right), and the first area of the substrate covers the screen of the phone partially (top views) or entirely (bottom views).

(25) FIG. 23 schematically shows part of the gaze tracking apparatus of the invention, applied to a window, where the first area of the substrate covers the window partially (left view) or entirely (right view).

(26) FIG. 24 schematically shows part of the gaze tracking apparatus of the present invention, for another embodiment for which a light filter is applied to an outer face of the substrate (in this case a lens), and the apparatus operates in a SWIR range. The drawing also shows the different light paths involved in the operation, where visible light (&lt;700 nm) has been separated from NIR, SWIR and MIR light (&gt;λ.sub.co).

(27) FIGS. 25 and 26 show, for two different embodiments of the apparatus of the present invention, schematic diagrams of the electronics and photodetector array circuitry of the optoelectronic device thereof. The glass (transparent) part of the spectacles only contains transparent conductive traces and photodetectors. All the active control and read-out components are located outside the glass part of the spectacles (for example, in the frame). The diagram of FIG. 25 shows a 3-terminal read-out scheme implementation (similar to those of FIGS. 8a and 8b) and the diagram of FIG. 26 shows a 2-terminal implementation (the one shown in FIG. 1).

DETAILED DESCRIPTION

(28) FIG. 1 illustrates a schematic block diagram of an embodiment of an image sensor with a non-local readout circuit according to the present invention. The image sensor 100 comprises a plurality of pixels 101 arranged as a two-dimensional array of M rows and N columns on a first area 102a of a substrate 102. In particular, FIG. 1 corresponds to a bottom plan view of the image sensor 100, that is, as seen through the substrate 102.

(29) The image sensor 100 further comprises a control unit operatively connected to the plurality of pixels 101 and adapted to selectively bias said pixels and read them out. The control unit comprises a first biasing circuit 103a for providing a first biasing voltage V.sub.DD, a second biasing circuit 103b for providing a second biasing voltage V.sub.SS, the second biasing voltage V.sub.SS being substantially symmetrical to the first biasing voltage V.sub.DD, and a readout circuit 104 for reading out the photo-signal generated by the light impinging on the pixels 101. The control unit also includes a plurality of output nodes 111 operatively connected to the readout circuit 104.

(30) The first biasing circuit 103a and the second biasing circuit 103b comprise, respectively, first selection means 105a and second selection means 105b to selectively bias one or more pixels 101 of said plurality that are to be read out at a given time. The first selection means 105a and the second selection means 105b are arranged outside the first area 102a of the substrate 102 and, as illustrated in the example of FIG. 1, comprise a plurality of switches (implemented as gate-controlled transistors).

(31) Each pixel 101 of the plurality of pixels comprises a photo-active element 106 and a non-photo-active reference element 107 disposed proximate to the photo-active element 106. Moreover, each pixel 101 further comprises a first contact 108a circuitally connected to the first biasing circuit 103a, a second contact 108b circuitally connected to the second biasing circuit 103b, and an output contact 109 circuitally connected to the readout circuit 104.

(32) The photo-active element 106 is circuitally connected between the first contact 108a and the output contact 109, while the reference element 107 is circuitally connected between the output contact 109 and the second contact 108b. The reference element 107 has a dark conductance that substantially matches the dark conductance of the photo-active element 106, making it possible to substantially suppress the dark current generated in the photo-active element 106 during the exposure cycle.
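The cancellation described in the preceding paragraph can be pictured with a minimal numerical sketch (not part of the patent; the conductance and supply values are illustrative assumptions): the pixel behaves as a conductance divider between the substantially symmetrical supplies, so a matched reference nulls the dark output, while illumination unbalances only the photo-active branch.

```python
def pixel_output(g_photo, g_ref, v_dd=1.0, v_ss=-1.0):
    """Voltage at the output node of a conductance divider:
    the photo-active element sits between V_DD and the node, the
    reference element between the node and V_SS, with the node
    read out at high impedance."""
    return (g_photo * v_dd + g_ref * v_ss) / (g_photo + g_ref)

g_dark = 1e-6  # S; dark conductance shared by both elements (assumed value)

# In the dark the matched reference cancels the divider output exactly:
v_dark = pixel_output(g_dark, g_dark)            # -> 0.0

# Light adds photoconductance to the photo-active element only,
# so the divider unbalances and a net photo-signal appears:
v_light = pixel_output(g_dark + 2e-7, g_dark)    # positive photo-signal
```

Under this toy model, any residual dark offset is proportional to the mismatch between the two dark conductances, which is why the matching condition is emphasized throughout the description.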

(33) As it can be seen in greater detail in the cross-sectional view of FIG. 2b, the photo-active element 106 comprises a photosensitizing layer 201 associated to a transport layer 202 that includes at least one layer of a two-dimensional material. Similarly, the reference element 107 also comprises a photosensitizing layer 203 associated to a transport layer 204 that includes at least one layer of a two-dimensional material.

(34) In this example the photosensitizing layer 201 of the photo-active element 106 and the photosensitizing layer 203 of the reference element 107 are disposed above (and, in particular, directly above) the transport layers 202 and 204, respectively. However, in other examples the photosensitizing layer of the photo-active element or that of the reference element can be disposed below its corresponding transport layer.

(35) The image sensor 100 further comprises first conductive traces 110a and second conductive traces 110b that connect the first biasing circuit 103a and the second biasing circuit 103b with, respectively, the first contact 108a and the second contact 108b of the pixels. In the example of FIG. 1, said first and second conductive traces 110a, 110b extend horizontally across the first area 102a of the substrate from the first and second biasing circuits 103a, 103b located on the leftmost and rightmost portions of the substrate 102, outside the first area 102a.

(36) Additionally, the image sensor 100 also comprises third conductive traces 110c (which in FIG. 1 extend along the vertical direction) that connect the output contacts 109 of the pixels in a daisy-chain configuration with the readout circuit 104, which is arranged in the uppermost portion of the substrate 102, outside said first area 102a.

(37) The substrate 102 is made of a flexible and transparent material, such as for example PET or PEN. In addition, the first contact 108a, the second contact 108b and the output contact 109 of the pixels 101, and said conductive traces 110a, 110b, 110c, are made of a transparent conducting oxide, such as for instance ITO.

(38) In the image sensor 100, the first biasing circuit 103a, the second biasing circuit 103b, and the readout circuit 104 (the three of them being comprised in the control unit of the image sensor 100) are arranged on a second area 102b located on the periphery of the same substrate 102, hence not overlapping the first area 102a on which the plurality of pixels 101 are arranged. However, in other examples, the control unit may be arranged on a different substrate provided in the image sensor.

(39) Referring now to FIGS. 2a and 2b, there is shown the layout of a pixel 101 of the image sensor 100 in which the photo-active element 106 is arranged next to the reference element 107 on a same level. In this example, the transport layer 202 of the photo-active element 106 and the transport layer 204 of the reference element 107 are coplanar. The first contact 108a and the output contact 109 (at opposite ends of the photo-active element 106) are disposed below the transport layer 202, while the second contact 108b and the output contact 109 (at opposite ends of the reference element 107) are disposed below the transport layer 204.

(40) The reference element 107 further comprises a first light-blocking layer 205 disposed above the photosensitizing layer 203 and the transport layer 204, and a second light-blocking layer 206 disposed below said photosensitizing layer 203 and said transport layer 204. In particular, the first light-blocking layer 205 is disposed directly above the photosensitizing layer 203, while the second light-blocking layer 206 is separated from the transport layer 204 by an insulating layer 207. The first and second light-blocking layers 205, 206 are passivation layers comprising an oxide.

(41) FIGS. 3a and 3b depict an alternative example of a pixel layout that can be used in the image sensor 100 of FIG. 1. For simplicity, elements in common with the pixel structure of FIGS. 2a and 2b have been labeled with the same reference numerals. The photo-active element 106 is circuitally connected between a first contact 308a and an output contact 309, while the reference element 107 is circuitally connected between the output contact 309 and a second contact 308b. Conversely to the case illustrated in FIGS. 2a and 2b, now the first contact 308a and the output contact 309 are disposed above the transport layer 202, more specifically between said transport layer 202 and the photosensitizing layer 201. In the same way, the second contact 308b and the output contact 309 are disposed above the transport layer 204, between said transport layer 204 and the photosensitizing layer 203.

(42) The transport layers 202, 204 are spaced from the substrate 102 by means of an insulating layer 307, which provides mechanical support for the deposition of the output contact 309 in the region between said transport layers 202, 204.

(43) Referring back to FIG. 1, it can be observed that the first and second conductive traces 110a, 110b cross at a number of places the third conductive traces 110c. To avoid the electrical contact between different conductive traces, the third conductive traces 110c are raised to pass above the first and second conductive traces 110a, 110b. An intervening insulating layer further prevents the electrical contact between traces. For this reason, the third conductive traces 110c advantageously comprise a vertical portion (such as a via) through the intervening insulating layer to make ohmic connection with the output contact 109 of the pixels.

(44) One of such crossings, in particular the one occurring in region A of the image sensor of FIG. 1, is illustrated in the cross-sectional view of FIG. 9, in which a third conductive trace 110c crosses above a second conductive trace 110b, both traces being spaced by an intervening insulating layer 900. The third conductive trace 110c comprises a vertical portion 901 that goes through the intervening insulating layer 900 to reach the level on which the output contacts 109 of the pixels are arranged.

(45) Alternatively, the pixels of the image sensor may advantageously have the first and second contacts 108a, 108b disposed below the transport layers 202, 204, and the output contact 109 disposed above the transport layers 202, 204. In this case, as the first and second conductive traces 110a, 110b will always be below the third conductive traces 110c, electrical contact between traces is avoided. Moreover, the third conductive traces 110c might no longer require vertical portions to make ohmic connection with the output contact 109 of the pixels. Nevertheless, even in this case, it is still preferred to have an intervening insulating layer to further isolate the first and second conductive traces from the third conductive traces.

(46) FIG. 4 shows, in a cross-sectional view, another example of a pixel suitable for an image sensor in accordance with the present invention. In particular, a pixel 401 is arranged on a substrate 400 and comprises a photo-active element 402 and a reference element 403 disposed one next to the other in a coplanar configuration. The photo-active element 402 is circuitally connected between a first contact 410a and an output contact 409, while the reference element 403 is circuitally connected between the output contact 409 and a second contact 410b. Moreover, an insulating layer 413 has been provided on the substrate 400, below the photo-active element 402 and the reference element 403.

(47) The photo-active element 402 comprises a photosensitizing layer 405 associated to a transport layer 406, which is disposed below the photosensitizing layer 405 and includes at least one layer of a two-dimensional material. Likewise, the reference element 403 also comprises a photosensitizing layer 407 associated to another transport layer 408, which is disposed below the photosensitizing layer 407 and includes at least one layer of a two-dimensional material. The first contact 410a, second contact 410b, and output contact 409 are sandwiched between the photosensitizing layers 405, 407 and the transport layers 406, 408.

(48) In this example, the transport layer 408 of the reference element has a smaller area than the transport layer 406 of the photo-active element, advantageously reducing the overhead in real estate due to the presence of the reference element 403 in the pixel 401. Despite being smaller in size, the transport layer 408 has the same shape as the transport layer 406 in order to ensure that the dark conductance of the reference element 403 substantially matches the dark conductance of the photo-active element 402.
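The shape-preserving scaling in the paragraph above can be checked with a one-line conductance model (a sketch, not from the patent; the sheet conductance and dimensions are assumed figures): for a rectangular 2-D channel, G = σ_s·W/L, so scaling both dimensions by the same factor shrinks the area while leaving W/L, and hence the dark conductance, unchanged.

```python
def sheet_conductance(sigma_sheet, width, length):
    """Conductance of a rectangular 2-D channel: G = sigma_s * W / L."""
    return sigma_sheet * width / length

sigma_s = 5e-5  # S/sq; illustrative sheet conductance of the 2-D transport layer

# Photo-active transport layer: 40 um x 20 um (hypothetical dimensions).
g_photo = sheet_conductance(sigma_s, width=40e-6, length=20e-6)

# Reference transport layer scaled down uniformly by 2x: the area shrinks
# 4x, but W/L is unchanged, so the dark conductance still matches.
g_ref = sheet_conductance(sigma_s, width=20e-6, length=10e-6)
```

This is why the description requires the smaller reference layer to keep the same shape as the photo-active layer: matching the aspect ratio, not the area, is what preserves the dark-conductance match.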

(49) Finally, as in the previous examples, the reference element 403 also comprises a first light-blocking layer 411 disposed above the photosensitizing layer 407 and a second light-blocking layer 412 disposed below the transport layer 408, so that the absorption of the incident light in the reference element 403 is prevented.

(50) A further example of a pixel suitable for an image sensor according to the invention is depicted in FIG. 5, in which a pixel 501 is disposed on a substrate 500 and comprises a reference element 503 arranged below a photo-active element 502, resulting in a very compact architecture with reduced footprint.

(51) The photo-active element 502 comprises a photosensitizing layer 504 disposed above a transport layer 505. Below the photo-active element 502, the reference element 503 also comprises a photosensitizing layer 506 disposed above another transport layer 507. A primary insulating layer 512 associated to the photo-active element 502 is arranged between the photo-active element 502 and the reference element 503, to provide isolation between the two elements.

(52) The reference element 503 comprises a first light-blocking layer 511 disposed above its photosensitizing layer 506 and a second light-blocking layer 510 disposed below the transport layer 507, separated from said transport layer 507 by means of a secondary insulating layer 513.

(53) The way of contacting the photo-active element 502 and the reference element 503 is somewhat different from what has been described above for the previous examples. A first contact 508a and a second contact 508b are provided at different levels on a same side of the pixel (namely, on the right-hand side in FIG. 5) and are circuitally connected to a first end of the photo-active element 502 and of the reference element 503, respectively.

(54) On the opposite side of the pixel 501 (on the left-hand side in the Figure), a common output contact 509 is circuitally connected to a second end of the photo-active element 502 and of the reference element 503. The output contact 509 comprises a vertical portion that extends from the transport layer 505 of the photo-active element to the transport layer 507 of the reference element.

(55) The geometry of the photo-active elements of the previous examples can be defined via patterning of the transport layer, which allows either maximizing the light-collection area or tailoring specific aspect ratios for the optimization of different performance parameters (such as for instance, but not limited to, noise, responsivity, and resistance).

(56) FIGS. 6a-6b and 7a-7b represent two pixel configurations based on the example already discussed in the context of FIGS. 3a-3b in which the pixel additionally comprises back-gate contacts.

(57) In the example of FIGS. 6a-6b, the pixel 601 comprises a back-gate contact 600 disposed below the photo-active element 106, between the insulating layer 307 and the substrate 102. In this case, the insulating layer 307 acts as a primary insulating layer associated to the photo-active element 106 which, together with the back-gate contact 600, allows finely controlling the conduction and photosensitivity of said photo-active element 106.

(58) As shown in FIG. 6a, the pixel 601 is a four-terminal device having the first contact 308a and the second contact 308b adapted to be circuitally connected, respectively, to first and second biasing circuits providing substantially symmetrical first and second biasing voltages V.sub.DD, V.sub.SS; the output contact 309 adapted to be circuitally connected to a readout circuit to deliver the photo-signal V.sub.OUT generated at the pixel; and the back-gate contact 600 to provide a gating voltage V.sub.GATE to the photo-active element 106.

(59) FIGS. 7a-7b show another example of a pixel comprising back-gate contacts. The pixel 701 has a layout similar to that of pixel 601, but differs in that it comprises not only a back-gate contact 700 disposed below the photo-active element 106 (between the insulating layer 307 and the substrate 102) but also an additional back-gate contact 702 disposed below the reference element 107. Said additional back-gate contact 702 is arranged between the insulating layer 307 and the second light-blocking layer 206.

(60) Now, the insulating layer 307 is, at the same time, a primary insulating layer associated to the photo-active element 106 and also a secondary insulating layer associated to the reference element 107. Although in this particular example the primary and secondary insulating layers are embodied as a same insulating layer, in other examples they can be different layers arranged at the same or different levels in the layout structure of the image sensor.

(61) The resulting pixel 701 can be operated as a five-terminal device in which its first and second contacts 308a, 308b are adapted to be circuitally connected, respectively, to first and second biasing circuits providing first and second biasing voltages V.sub.DD, V.sub.SS and its output contact 309 is adapted to be circuitally connected to a readout circuit to deliver the photo-signal V.sub.OUT generated at the pixel 701. Additionally, the back-gate contact 700 is configured to provide a gating voltage V.sub.GATE1 to the photo-active element 106 to fine-tune, for example, its photosensitivity, while the back-gate contact 702 is adapted to provide a gating voltage V.sub.GATE2 to the reference element 107 to adjust its conductance.

(62) Although in these examples the pixels 601, 701 are provided with back-gate contacts only, in other examples they may comprise, additionally or alternatively, top-gate contacts.

(63) Referring now to FIG. 8a, there is shown, in a bottom plan view, the block diagram of an image sensor of the present invention. The image sensor 800 comprises a plurality of pixels 801 arranged as a two-dimensional array comprising a plurality of rows, each with the same number of pixels, aligned so as to define a plurality of columns. The plurality of pixels 801 are arranged on a first area 802 of a substrate (not depicted in the Figure).

(64) The image sensor 800 comprises a control unit operatively connected to the plurality of pixels 801, which includes a first biasing circuit 803a for providing a first biasing voltage V.sub.DD, a second biasing circuit 803b for providing a second biasing voltage V.sub.SS, and a readout circuit 804. In particular, the second biasing voltage V.sub.SS is substantially symmetrical to the first biasing voltage V.sub.DD.

(65) The first biasing circuit 803a and the second biasing circuit 803b comprise, respectively, first row-select switches 805a and second row-select switches 805b to selectively bias the rows of the array.

(66) The first and second row-select switches 805a, 805b make it possible to sequentially enable only one row of the array at a time while leaving the other rows disabled, which makes it possible to daisy-chain the pixels 801 of each column of the array to the readout circuit 804, as can be observed in FIG. 8a. This greatly simplifies the interconnection of the pixels 801 to the readout circuit 804 and reduces the power consumption of the image sensor 800 during operation.

(67) Each pixel 801 comprises a photo-active element 809 circuitally connected between a first contact 811a and an output contact 812, and a reference element 810 circuitally connected between the output contact 812 and a second contact 811b. The structure of the pixels 801 is the same as the one for the pixels 101, which has already been described in detail above in the context of the image sensor 100 in FIG. 1.
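The balanced behaviour of the pixel just described, with the photo-active element and its matched reference element forming a voltage divider between the two biasing contacts, can be sketched numerically. This is an illustrative conductance model, not taken from the patent; all component values are assumptions:

```python
# Illustrative model of the balanced pixel: the photo-active element
# (conductance g_photo) and the matched reference element (conductance g_ref)
# form a voltage divider between V_DD and V_SS, with V_OUT at the midpoint.

def pixel_output(g_photo: float, g_ref: float, v_dd: float, v_ss: float) -> float:
    """Voltage at the output contact of the two-element divider."""
    return (g_photo * v_dd + g_ref * v_ss) / (g_photo + g_ref)

# In the dark the conductances substantially match, so with symmetric biases
# (V_SS = -V_DD) the output sits at the voltage reference (0 V):
dark = pixel_output(g_photo=1e-6, g_ref=1e-6, v_dd=1.0, v_ss=-1.0)

# Light increases only the photo-active element's conductance, unbalancing
# the divider and producing a nonzero photo-signal V_OUT:
lit = pixel_output(g_photo=1.5e-6, g_ref=1e-6, v_dd=1.0, v_ss=-1.0)
print(dark, lit)   # dark is 0.0; lit is 0.2
```

Because the dark output is pinned to the reference regardless of the absolute dark conductance, the readout only carries the light-induced imbalance.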

(68) The image sensor 800 further comprises first conductive traces 815a and second conductive traces 815b that connect the first biasing circuit 803a and the second biasing circuit 803b with, respectively, the first contact 811a and the second contact 811b of the pixels.

(69) The readout circuit 804 includes a multiplexer 806 (depicted as a plurality of switches) that comprises as many input terminals 813 as there are pixels 801 in each row and an output terminal 814. Each input terminal 813 is circuitally connected to the output contact 812 of a pixel of each row (in particular the pixels forming a column) by means of third conductive traces 815c provided in the image sensor 800.

(70) The readout circuit further comprises an amplifier 807 operatively connected in series to the output terminal 814 of the multiplexer, and a storage element 808 operatively connected in series to the amplifier 807 and configured to store a voltage proportional to the photo-signal generated in a pixel 801 of the plurality of pixels.

(71) Upon readout, the control unit activates only one first row-select switch 805a and only one second row-select switch 805b at a time, biasing with balanced voltages only one row of pixels 801 of the array, while the pixels in the other rows remain disabled.

(72) In this manner, only the pixels 801 in the selected row load the input terminals 813 of the multiplexer 806. This makes it possible for a pixel 801 of the selected row to be connected to the corresponding input terminal 813 of the multiplexer 806 by means of the output contacts 812 of the other pixels arranged in the same column as said pixel, and the third conductive traces 815c connecting said output contacts 812. Then, the photo-signal generated in each pixel 801 of the selected row can reach the readout circuit 804 without being disturbed by the pixels in the other rows.
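The row-sequential readout described above can be sketched as a small simulation. The 2x2 "scene", the switch and multiplexer models and all names below are illustrative assumptions, not part of the patent text:

```python
# Toy simulation of the row-sequential readout of the image sensor 800.

scene = [[0.0, 0.1],       # photo-signals V_OUT a 2x2 pixel array would produce
         [0.2, 0.3]]
state = {"row": None}      # which row is currently biased

def enable_row(r: int) -> None:
    # Models closing only row r's first and second row-select switches
    # (805a, 805b), so only that row sees the balanced biases V_DD / V_SS.
    state["row"] = r

def read_column(c: int) -> float:
    # Models the multiplexer input terminal of column c: the disabled rows
    # do not load the shared column trace, so only the enabled row's pixel
    # reaches the readout circuit.
    return scene[state["row"]][c]

def read_frame(num_rows: int, num_cols: int) -> list:
    frame = []
    for r in range(num_rows):
        enable_row(r)                                   # bias one row at a time
        frame.append([read_column(c) for c in range(num_cols)])
    return frame

print(read_frame(2, 2))   # recovers the scene row by row
```

The same loop structure applies to the per-column-amplifier variant of FIG. 8b; only the model of `read_column` would change.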

(73) FIG. 8b shows another example of an image sensor that is similar in topology to the one just described in the context of FIG. 8a but with an alternative readout circuit design. The image sensor 850 comprises a plurality of pixels 851 arranged on a first area 852 of a substrate and operatively connected to a control unit that includes first and second biasing circuits 853a, 853b circuitally connected, respectively, to first and second contacts 861a, 861b of each pixel, and a readout circuit 854 circuitally connected to an output contact 862 of each pixel. The structure of the pixels, and that of the first and second biasing circuits of the image sensor 850, are similar to the ones comprised in the image sensor 800 and already described above.

(74) The readout circuit 854 comprises as many amplifiers 857 as there are pixels 851 in each row, that is, the readout circuit 854 comprises an amplifier 857 for each column. Each amplifier 857 has an input terminal 863 circuitally connected to the output contact of a pixel 851 of each row, and an output terminal 864. In addition, the readout circuit 854 also comprises a storage element 858 that is connected in series to the output terminal 864 of each amplifier and configured to store a voltage proportional to the photo-signal generated in the pixels.

(75) Additionally, the control unit of the image sensor 850 includes an interconnection circuit 866 (a multiplexer in the example of FIG. 8b), operatively connected to the readout circuit 854 and comprising an output node 867. The interconnection circuit 866 makes it possible to circuitally connect, through the readout circuit 854, the output contact 862 of any of the pixels of the array with the output node 867.

(76) The image sensor 100 with non-local readout circuit described above in the context of FIGS. 1, 2a and 2b can be manufactured by means of a method that comprises the steps of:

(77) a) providing a transport layer 202 including at least one layer of a two-dimensional material, and a photosensitizing layer 201 associated to the transport layer 202, on a first area 102a of a substrate 102;

(78) b) providing a first biasing circuit 103a, a second biasing circuit 103b and a readout circuit 104 in the control unit, the first biasing circuit 103a providing a first biasing voltage V.sub.DD, the second biasing circuit 103b providing a second biasing voltage V.sub.SS substantially symmetrical to the first biasing voltage, and the readout circuit 104 being adapted to read out the photo-signal generated by the light impinging on the pixels 101;

(79) c) arranging first selection means 105a and second selection means 105b provided, respectively, in the first biasing circuit 103a and the second biasing circuit 103b outside the first area 102a of the substrate, the first selection means 105a and second selection means 105b being adapted to selectively bias one or more pixels 101 of said plurality that are to be read out at a given time;

(80) For each pixel 101 of the plurality of pixels, the method further comprises:

(81) d) defining a photo-active element 106 at a selected location of the transport layer 202 and the photosensitizing layer 201 arranged on the first area 102a of the substrate, and circuitally connecting the photo-active element 106 between a first contact 108a and an output contact 109 provided in said pixel 101;

(82) e) arranging a non-photo-active reference element 107 proximate to the photo-active element 106 of said pixel, the reference element 107 having a dark conductance that substantially matches the dark conductance of the photo-active element 106, and circuitally connecting the reference element 107 between said output contact 109 and a second contact 108b provided in said pixel 101;

(83) f) circuitally connecting the first contact 108a, the second contact 108b, and the output contact 109 of said pixel 101 to, respectively, the first biasing circuit 103a, the second biasing circuit 103b, and the readout circuit 104 of the control unit.

(84) FIGS. 10a-10g present the different steps involved in the process of fabrication of the pixel 101 shown in FIGS. 2a-2b.

(85) First, as can be seen in FIG. 10a, the second light-blocking layer 206 is selectively deposited, for example by means of a photomask in a conventional photolithographic process, on top of the substrate 102 only in the area that will be occupied by the reference element 107 of the pixel 101. Next (FIG. 10b), a passivation layer comprising an oxide is uniformly grown over the substrate to obtain the insulating layer 207, which covers the second light-blocking layer 206 and prepares the substrate for the deposition of the contacts of the pixel 101.

(86) At this stage, the first contact 108a and the second contact 108b are defined at opposite ends of the pixel 101, together with the first conductive trace 110a and the second conductive trace 110b (not shown in FIG. 10c) to provide the first and second biasing voltages V.sub.DD, V.sub.SS. Before defining the output contact 109 (illustrated in FIG. 10d) connected through its corresponding third trace 110c to the readout circuit 104, it is necessary to grow an intervening insulating layer (such as the one described with reference to FIG. 9) to avoid electrical contact where the conductive traces 110a, 110b, 110c cross.

(87) Afterwards, one or more layers of a two-dimensional material are progressively deposited on the substrate. Then, the transport layer 202 of the photo-active element 106 and the transport layer 204 of the reference element 107 are etched, one next to the other, between the contacts 108a, 108b, 109 previously defined (see FIG. 10e).

(88) Next, FIG. 10f shows the deposition of a photosensitizing material on top of the one or more layers of two-dimensional material, from which the photosensitizing layer 201 of the photo-active element 106 and the photosensitizing layer 203 of the reference element 107 are patterned above their corresponding transport layers 202, 204.

(89) Finally, the first light-blocking layer 205 is laid out selectively on top of the photosensitizing layer 203 of the reference element, as depicted in FIG. 10g. Optionally, at this final stage, a protective encapsulation layer made of a wide-bandgap dielectric material can be disposed above the pixel 101.

(90) The process of fabrication of the pixel shown in FIGS. 3a-3b would be essentially similar to the one just discussed, except that the deposition of the one or more layers of the two-dimensional material, and the subsequent etching of the transport layers 202, 204, would be carried out prior to the definition of the contacts 108a, 108b, 109.

(91) Referring now to FIG. 11, there is shown an example of an image sensor capable of multispectral response. The image sensor 1100 comprises a plurality of pixels arranged as a two-dimensional array and grouped into clusters s1-s9. Each cluster comprises at least one pixel having a photo-active element with a photosensitizing layer sensitive to a different range of the spectrum. In this particular example, the photosensitizing layer of the photo-active elements comprises quantum dots, whose size is progressively varied to tune their light absorption properties to different wavelengths.

(92) FIGS. 13a and 13b show a further embodiment of the image sensor of the present invention in which a light-concentrating structure 1300 is arranged on top of the image sensor (above each pixel, or above only some of its pixels), specifically on top of an insulating layer 1301 disposed above the photosensitive element, in order to enhance the response of the photosensitive element (in a non-illustrated embodiment, the structure could be arranged directly on top of the image sensor, without an insulating layer in between). In the illustrated embodiment, the light-concentrating structure 1300 is a plasmonic bull's eye metallic structure, although other geometries of plasmonic and/or dielectric structures, made of metals, dielectrics, heavily doped semiconductors or graphene, can alternatively be used, the choice being determined by the spectral range intended to be covered by the image sensor.

(93) For the embodiment of FIGS. 14a and 14b, the response of the photosensitive element is further enhanced by adding a so-called microlens 1400 on top of each pixel (only one pixel is shown in the Figure).

(94) To demonstrate the spectral tunability of the photosensitive elements of the image sensor of the present invention, a prototype has been built including an arrangement of several pixels that differ from one another in that they are configured to be sensitive to different ranges of the light spectrum, in this case through the selection of the quantum dots (specifically their sizes) forming their respective photosensitizing layers: one is configured to be sensitive to short-wave infrared light (SWIR), another to near-infrared light (NIR) and another to visible light (VIS). The resulting spectral responses are depicted in FIG. 15, identified as SWIR-QDs, NIR-QDs and VIS-QDs.

(95) The plot of FIG. 16 shows data from a pixelated detector built according to the image sensor of the present invention and illuminated with diffracted light; the data shows how the present invention allows measuring the spectral decomposition of the impinging light.

(96) The plot of FIG. 17 shows data extracted from a graphene 4-pixel photodetector linear array on a flexible and transparent substrate. The sensors of the array have an area of 11 mm.sup.2 and a pixel pitch of 1.3 mm. The data is obtained by performing a reflective photoplethysmogram measurement on the finger of a person using a green (532 nm) light-emitting diode as the light source. Each of the four depicted curves corresponds to a different colour.

(97) FIG. 12 represents the block diagram of an optoelectronic device, in particular a wireless wearable device, which incorporates a photodetector array according to the present invention.

(98) The optoelectronic device 1200 comprises the image sensor 100 described in FIG. 1 arranged on a flexible and/or stretchable substrate 1201, together with an analog-to-digital converter 1202, a control module 1203 and a power supply module 1204 operatively connected to the control unit of the image sensor 100.

(99) The control module 1203 is configured to provide control signals 1205 to the control unit of the image sensor 100 to selectively bias and read out the pixels 101, and to receive a plurality of detected values 1206 corresponding to the photo-signals read out from the plurality of pixels 101 by the readout circuit 104. The analog-to-digital converter 1202 is circuitally connected between the image sensor 100 and the control module 1203 and is adapted to digitize the detected values 1206 before they are delivered to the digital circuitry embedded in the control module 1203.

(100) The power supply module 1204 is configured to provide the first and second biasing voltages V.sub.DD, V.sub.SS to the first and second biasing circuits 103a, 103b and to energize the active devices of the readout circuit 104.

(101) The optoelectronic device 1200 further comprises an antenna 1207 operatively interfaced with an RF-circuit included in the control module 1203, and that allows the optoelectronic device 1200 to communicate via a wireless connectivity standard (such as WiFi, Bluetooth or ZigBee) with a user terminal 1208 provided with an antenna 1209, such as a mobile telephone. The wireless link between the optoelectronic device 1200 and the user terminal 1208 is advantageously used to program the optoelectronic device 1200 remotely from the user terminal 1208, and to transfer data (such as for instance raw and/or processed data relating to the detected values 1206 corresponding to the photo-signals read out from the pixels 101).

(102) FIG. 18 schematically shows a gaze tracking principle, according to a conventional gaze tracking apparatus and approach, for active IR illumination. The problem is that the camera has to be placed sufficiently close to the eye to capture enough IR light at sufficiently high resolution, while not blocking the user's vision. The demands are high resolution and high quantum efficiency (Q.E.).

(103) FIG. 19 schematically shows part of the gaze tracking apparatus of the present invention (some elements have been omitted, such as the control unit, electrically conductive traces, etc.), for an embodiment, and the operation thereof to achieve an improved gaze tracking process. The optical lenses with integrated array of photodetectors (image sensor) capture a much wider range of eye movements and more light, so the requirements on resolution and Q.E. are reduced. Moreover, the apparatus of the present invention can operate with eye-safe infrared light (>1.1 um, for example 1.5 um). The ability to operate in this wavelength range also avoids any distraction by the active illumination.

(104) The present inventors have built a prototype of the gaze tracking apparatus of the present invention for which the image sensor is an IR camera fabricated directly on a polycarbonate lens (or on an interspersor) using Graphene Quantum Dot photodetection technology. The IR camera covers all or a portion of the lens, is semi-transparent and has a built-in photoconductive gain. The user cannot see the camera, but the camera is an integral part of the glass lenses.

(105) The main challenge for a semi-transparent camera is to deal with ambient light. For the design of the above-mentioned prototype, the present inventors focused on the technological solutions for dealing with this ambient light applied to gaze tracking based on a semi-transparent camera that is disposed in between the eye-ball and the object of interest. In FIG. 20, the different light paths involved are schematically drawn together with a schematic representation of the built prototype.

(106) In FIG. 20, four main light sources impinging on the semi-transparent camera are identified:

(107) 1. Ambient light, λ<λ.sub.co

(108) 2. Ambient light, λ<λ.sub.co, reflected from the eye-ball

(109) 3. Ambient light, λ>λ.sub.co, reflected from the eye-ball

(110) 4. Reflected light from active illumination

(111) Source 1 does not contain useful information for gaze tracking purposes. Sources 2-4 contain gaze information. Source 2, however, cannot be disentangled from source 1 and hence cannot be used for gaze tracking.

(112) The pixels of the camera can be made of a transport layer and a sensitizing layer. Hybrid photodetectors can be made transparent and can be fabricated on a transparent substrate such as a lens. The lens also contains a short-pass filter that blocks all ambient light with λ>λ.sub.co. This short-pass filter is placed in between the camera and the object of interest.

(113) The camera of the built prototype is characterized by the following:

(114) 1. Semi-transparent (>80% transmission) to visible light (λ<λ.sub.co).

(115) 2. Rejects the influence of ambient light (sunlight, room light, etc.) on the reading.

(116) 3. Can be selective to a wavelength or a band of wavelengths with λ>λ.sub.co.

(117) 4. Is preferably sensitive in the eye-safe wavelength range between 1300 nm and 2000 nm.

(118) The camera of the built prototype can reject ambient light influences using the following techniques:

(119) 1. Active illumination with a wavelength λ>λ.sub.co that induces more signal than the ambient light of source 1.

(120) 2. Modulated active illumination and lock-in type read-out.

(121) 3. Enhance the selectivity of the detectors for the wavelength of the active illumination (λ>λ.sub.co) over the ambient light of source 1. One of the following techniques can be used to enhance the spectral selectivity of the sensors: a plasmonic filter on the pixels of the camera with a central wavelength that matches the wavelength of the active illumination; a multilayer thin-film optical cavity filter that enhances absorption at the wavelength of the active illumination and reduces absorption in the visible range; or an organic sensitizing layer on the hybrid photodetector with a second-order transition with an energy <1.71 eV.
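Technique 2 above (modulated active illumination with lock-in read-out) can be sketched numerically. The modulation frequency, sample rate and signal levels below are illustrative assumptions, not values from the prototype:

```python
import math

# Toy lock-in demodulation: the active illumination is modulated at f_mod,
# and the pixel samples are mixed with a reference at the same frequency,
# so the large unmodulated ambient term (source 1) averages out.

f_mod, fs, n = 1_000.0, 100_000.0, 10_000   # 1 kHz modulation, 0.1 s window
signal_amp, ambient = 0.05, 3.0             # weak gaze signal on a large ambient level

acc = 0.0
for i in range(n):
    t = i / fs
    ref = math.sin(2 * math.pi * f_mod * t)            # lock-in reference at f_mod
    sample = ambient + signal_amp * (0.5 + 0.5 * ref)  # pixel: ambient + modulated reflection
    acc += sample * ref                                # mix with the reference

# Over an integer number of modulation periods the ambient (DC) term and the
# DC half of the modulated signal average to zero; only the in-phase
# component survives, with mean(sample*ref) = signal_amp/4.
recovered = 4.0 * acc / n
print(recovered)   # close to signal_amp (0.05), independent of `ambient`
```

The same arithmetic explains why the read-out is insensitive to slowly varying sunlight or room light: anything not modulated at f_mod contributes nothing to the accumulated product.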

(122) The active light source preferably has a wavelength in the eye-safe region between 1300 nm and 2000 nm.

(123) If the spectral selectivity is significant, we can envision using only light source 3 (reflected ambient light with λ>λ.sub.co) for capturing gaze information. This can dramatically reduce the power consumption of the full system.

(124) The electronics to read, interpret and process the sensor images are located away from the lens (such as in the frame of the eyeglasses, or of a window or of a screen), as illustrated in FIGS. 25 and 26. The glass of the spectacles (or of a window or screen) only contains transparent conductive wiring (or very thin and widely spaced traces), semi-transparent photodetectors and miniature lenses covering the photodetectors. The built-in photoconductive gain of the photodetectors of the image sensor (especially of the graphene quantum dot photodetectors) enables the above-described non-local read-out, i.e. performing the read-out away from the photodetectors.

(125) Implementations

(126) As already indicated in a previous section of the present document, the gaze tracking apparatus of the present invention can be implemented on different devices as is illustrated in FIGS. 20 to 23.

(127) FIG. 21 shows different possibilities for integrating the apparatus on spectacles. The semi-transparent image sensor that provides the gaze tracking can cover the entire glass (bottom views), or only part of it (top views). For improved accuracy, the gaze tracking apparatus can be placed on both glasses of the spectacles, as illustrated in the right views of FIG. 21.

(128) An implementation for a virtual reality application is shown in FIG. 20, where the image sensor is arranged on the lens or visor of virtual reality goggles or a headset, between the eye and a display.

(129) An implementation in an augmented reality system (or smart glasses) or on the visor of a helmet, not shown, is also possible with the apparatus of the present invention.

(130) Implementations of the apparatus of the invention on the screen of different computing devices are shown in FIG. 22, for different embodiments.

(131) Finally, FIG. 23 represents an implementation of the apparatus of the present invention on a window, where the image sensor is attached to or integral with the window, either partially (left view) or entirely (right view).

(132) While the invention has been described with respect to some specific examples, including presently preferred modes of carrying out the invention, those skilled in the art will appreciate that there are numerous variations and permutations of the above described image sensor and optoelectronic device using said image sensor, including substitution of specific elements by others technically equivalent, without departing from the scope of the invention as set forth in the appended claims.