Detection Apparatus with Optical Detector and Particle Detection Method
20250277736 · 2025-09-04
Inventors
CPC classification
G01N21/17
PHYSICS
International classification
G01N21/17
PHYSICS
Abstract
An example method for detecting target particles in a detection apparatus includes receiving, by at least one processor of the detection apparatus, a first image data at a first wavelength from an image sensor of the detection apparatus. The method includes receiving, by the at least one processor, a second image data at a second wavelength from a particle detector of the detection apparatus. The method includes obtaining, by the at least one processor, a 2D image based on the first image data. The method includes obtaining, by the at least one processor, a depth map data based on the first image data. The method includes obtaining, by the at least one processor, a particle map data based on a ratio of the second image data to the 2D image. The method includes determining, by the at least one processor, a concentration result based on the depth map data and the particle map data.
Claims
1. A method for detecting target particles in a detection apparatus, the method comprising: receiving, by at least one processor of the detection apparatus, a first image data at a first wavelength from an image sensor of the detection apparatus; receiving, by the at least one processor of the detection apparatus, a second image data at a second wavelength from a particle detector of the detection apparatus; obtaining, by the at least one processor of the detection apparatus, a 2D image based on the first image data; obtaining, by the at least one processor of the detection apparatus, depth map data based on the first image data; obtaining, by the at least one processor of the detection apparatus, a particle map data based on a ratio of the second image data to the 2D image; and determining, by the at least one processor of the detection apparatus, a concentration result based on the depth map data and the particle map data.
2. The method of claim 1, further comprising determining, by the at least one processor of the detection apparatus, a 3D image based on the first image data.
3. The method of claim 1, further comprising storing, by the at least one processor of the detection apparatus, a calibration parameter obtained from a ratio of light output power of the image sensor and the particle detector.
4. The method of claim 3, further comprising determining, by the at least one processor of the detection apparatus, a concentration result based on the calibration parameter, the depth map data, and the particle map data.
5. The method of claim 1, wherein the image sensor comprises a plurality of photodetectors forming a pixel array and at least one light emitter.
6. The method of claim 5, wherein at least a portion of the pixel array is a time-of-flight sensor.
7. The method of claim 1, wherein the particle detector comprises a light emitter and an optical detector.
8. The method of claim 7, wherein the optical detector comprises germanium (Ge), tin (Sn), and silicon (Si).
9. The method of claim 1, wherein the first wavelength is different from the second wavelength.
10. The method of claim 1, wherein light with the first wavelength is not absorbed by the target particles.
11. The method of claim 1, wherein the at least one processor includes a 3D image calculating module configured to provide a 3D image, a particle map calculating module configured to provide the particle map data, and a concentration calculating module configured to provide the concentration result.
12. The method of claim 1, wherein the particle map data contains the particle distribution information.
13. The method of claim 1, wherein the concentration result is an average concentration of the target particles.
14. The method of claim 1, wherein the concentration result is a total amount of the target particles.
15. A detection apparatus configured to detect target particles, comprising: an image sensor configured to provide a first image data at a first wavelength; a particle detector configured to provide a second image data at a second wavelength; and at least one processor coupled to the image sensor and the particle detector and configured to: receive the first image data; receive the second image data; obtain a 2D image based on the first image data; obtain depth map data based on the first image data; obtain a particle map data based on a ratio of the second image data to the 2D image; and determine a concentration result based on the depth map data and the particle map data.
16. The detection apparatus of claim 15, wherein the at least one processor is configured to determine a 3D image based on the first image data.
17. The detection apparatus of claim 15, wherein the at least one processor is configured to store a calibration parameter obtained from a ratio of light output power of the image sensor and the particle detector.
18. The detection apparatus of claim 17, wherein the at least one processor is configured to determine a concentration result based on the calibration parameter, the depth map data, and the particle map data.
19. The detection apparatus of claim 15, wherein the image sensor comprises a plurality of photodetectors forming a pixel array and at least one light emitter.
20. The detection apparatus of claim 19, wherein at least a portion of the pixel array is a time-of-flight sensor.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
[0025] The foregoing aspects and many of the advantages of this application will become more readily appreciated as the same becomes better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings:
DETAILED DESCRIPTION
[0035] The following embodiments are described with reference to the drawings to illustrate the concepts of the present disclosure. In the drawings and descriptions, similar or identical parts use the same reference numerals, and in the drawings the shape, thickness, or height of an element may be reasonably expanded or reduced. The embodiments listed in the present application are only used to illustrate the present application and are not used to limit its scope. Any obvious modification or change made to the present application does not depart from the spirit and scope of the present application.
[0036] The present disclosure relates to a detection apparatus that can include an optical sensor device to detect the presence of target particles and/or the concentration of target particles in a reference substance. For example, an alcohol detection apparatus can be used to detect the concentration of ethanol in a liquid or gas. The liquid can be water, blood, etc. The gas can be air, e.g., breathing air exhaled by a person. The detection apparatus can be installed in a portable device used by a person, for example, to perform alcohol detection before driving vehicles or entering a certain area. The portable device may be a mobile phone, a handheld instrument, an earbud, a pair of glasses, a helmet, a wristband, a watch, a ring, etc.
[0038] As shown in
[0039] The optical detector 122 can include a single photoelectronic device or a plurality of photoelectronic devices arranged in an array. In an embodiment, the optical detector 122 includes a plurality of photoelectronic devices arranged in a one-dimensional array or a two-dimensional array. The photoelectronic device can include a supporting substrate and a detecting region supported by the supporting substrate. The detecting region can include group IV materials or group III-V materials and is configured to absorb photons. The group IV materials can include silicon (Si) or germanium (Ge). The group III-V materials can include Al, Ga, In, N, P, As, Sb, or any combination thereof. The supporting substrate can include a material, such as silicon, different from that of the detecting region. The optical detector 122 can detect visible light or non-visible light according to the application. The visible light can include blue, navy, green, yellow, or red light. The non-visible light can include NIR or SWIR.
[0040] The light emitter 121 can be a semiconductor light-emitting element, such as a light-emitting diode (LED), a laser diode, a vertical-cavity surface-emitting laser (VCSEL), or an organic light-emitting diode (OLED). The light emitter 121 can emit light with a wavelength corresponding to the detecting wavelength of the optical detector 122.
[0042] The reference substance to be tested may be a mixture containing target particles, such as alcohol mixed in water or alcohol mixed in air. The target particles may have high absorbance or low transmittance relative to an optical signal with a specific wavelength. However, other substances that do not need to be detected (such as the reference substance) may also have high absorbance or low transmittance at that wavelength. Therefore, the light intensity of the optical signal received by the optical detector is affected not only by the absorbance of the target particles but also by the absorbance of those other substances, and the optical sensor device may make an erroneous determination. In addition, the optical characteristics of the light emitter may change with ambient temperature or usage time. The wavelength of the optical signal generated by the light emitter driven by the same driving current may shift due to different ambient temperatures or long usage time, causing the wavelength to drift into a band easily interfered with by other substances that do not need to be detected.
[0043] Accordingly, before detecting the presence of target particles, the optical sensor device needs to determine a testing optical signal to be emitted from the light emitter at a target wavelength. The target particles have high absorbance or low transmittance relative to the testing optical signal at the target wavelength, while other substances that do not need to be detected (such as the reference substance) have low absorbance or high transmittance (for example, light transmittance greater than 98%) at that wavelength. Determining the testing optical signal in this way reduces the influence of other substances on the detection of the presence or concentration of the target particles, thereby improving the accuracy of the detection results. The wavelength of the optical signal can be changed by applying various driving currents to the light emitter through the processor. Taking a gas sample as the reference substance as an example, the detection apparatus can be used to detect the presence or concentration of alcohol in the gas sample. In one case, the gas sample is breathing air containing a lot of moisture, so the optical sensor device needs to choose a testing optical signal with a target wavelength at which absorbance is high (or transmittance low) for ethanol but absorbance is low (or transmittance high) for water. When detecting a gas sample with alcohol, most of the testing optical signal is then absorbed by the alcohol and not by the water.
[0046] In S403, the one or more processors obtain a plurality of reference light intensities from the optical detector, each corresponding to a specific wavelength of the plurality of wavelengths. The one or more processors can also record the driving current corresponding to each of the reference light intensities. Since no target particles are present in the detection apparatus, the optical sensor device can recognize the portion of the reference optical signals absorbed by the reference substance, which does not need to be tested, by comparing the reference light intensities.
[0047] In S405, the one or more processors determine a target wavelength of the plurality of wavelengths based on the plurality of reference light intensities. The light intensity at the target wavelength is strong because the reference substance has low absorbance or high transmittance at that wavelength. The one or more processors can also determine a target driving current corresponding to the target wavelength. In one implementation, the one or more processors calculate a plurality of decision indices, one for each reference light intensity, and determine the target wavelength based on the decision indices; the decision indices correlate with the absorbance and transmittance of the reference substance at the plurality of wavelengths. In another implementation, the optical sensor device obtains the decision indices by comparing the intensity of each reference optical signal from the light emitter with the corresponding reference light intensity from the optical detector. In another implementation, the one or more processors determine the highest of the reference light intensities and select the corresponding wavelength as the target wavelength.
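The wavelength-selection step S405 can be sketched as follows. This is an illustrative example, not the patented implementation: the sweep tuples, values, and function name are assumptions, and it uses the simplest implementation named above (picking the wavelength with the highest reference intensity).

```python
# Hypothetical sketch of S401-S405: sweep driving currents, record the
# reference light intensity at each wavelength, and select the wavelength
# least absorbed by the reference substance (the highest reference intensity).

def select_target_wavelength(sweep):
    """sweep: list of (driving_current_mA, wavelength_nm, reference_intensity)."""
    # The highest reference intensity marks the wavelength at which the
    # reference substance (e.g., water vapor in breath) absorbs the least.
    current, wavelength, _ = max(sweep, key=lambda s: s[2])
    return wavelength, current

# Illustrative sweep values (not from the source):
sweep = [
    (10.0, 1300, 0.82),
    (12.0, 1310, 0.97),  # barely absorbed by the reference substance
    (14.0, 1320, 0.88),
    (16.0, 1393, 0.31),  # strongly absorbed band
]
target_wavelength, target_current = select_target_wavelength(sweep)
```

The processor would then drive the light emitter with the recorded target driving current to produce the testing optical signal at the target wavelength.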
[0048] In S407, the one or more processors control the light emitter to emit a testing optical signal with the target wavelength in the detection apparatus. In an embodiment, a person can blow breathing air into the detection apparatus through the inlet, and the one or more processors then control the light emitter to emit a testing optical signal with the target wavelength for detecting the presence and/or concentration of alcohol in the breathing air. In another embodiment, a liquid is injected into the detection apparatus through the inlet, and the one or more processors then control the light emitter to emit a testing optical signal with the target wavelength for detecting the presence or concentration of alcohol in the liquid. The target particles (e.g., ethanol) have high absorbance or low transmittance relative to the testing optical signal, and the reference substance (e.g., water) has low absorbance or high transmittance relative to the testing optical signal. For example, the target wavelength of the testing optical signal can be within one of zone 1 to zone 4 in
[0049] In S409, the one or more processors obtain an output light intensity from the optical detector. A small portion of the testing optical signal at the target wavelength may be absorbed by the reference substance. In an implementation, the one or more processors can calibrate the output light intensity based on the transmittance of the reference substance at the target wavelength to obtain a more accurate light intensity attributable to the target particles. In S411, the one or more processors determine an output representing at least one of (i) a presence of target particles or (ii) a concentration of the target particles based on the output light intensity. Since most of the testing optical signal is not absorbed by the reference substance, the output light intensity from the optical detector is affected mainly by the target particles and can be used to determine their presence or concentration through mathematical methods. In another implementation, the one or more processors determine the presence of the target particles by checking whether the output light intensity is below a predetermined threshold. In another implementation, the one or more processors can display the output through a display component of the detection apparatus.
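The calibration and threshold check of S409-S411 can be sketched as a few lines of code. The function name, the 98% transmittance, and the threshold value are illustrative assumptions; only the calibrate-then-compare structure follows the text.

```python
# Illustrative sketch of S409-S411: remove the small attenuation caused by
# the reference substance, then flag target-particle presence when the
# calibrated intensity falls below a predetermined threshold.

def detect_presence(output_intensity, reference_transmittance, threshold):
    # Dividing by the reference substance's transmittance at the target
    # wavelength leaves the loss attributable to the target particles.
    calibrated = output_intensity / reference_transmittance
    return calibrated < threshold

# With 98% reference transmittance and a threshold of 0.9, a reading of
# 0.6 indicates target particles are present; a reading of 0.95 does not.
present = detect_presence(0.6, 0.98, 0.9)
absent = detect_presence(0.95, 0.98, 0.9)
```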
[0050] In an implementation, the light emitter 121 can be a tunable single-wavelength laser diode or VCSEL, which can precisely control the wavelength of the emitting light by controlling the driving current.
[0051] In an implementation, the mathematical methods used to determine the concentration of the target particles may be based on the Beer-Lambert law principle, which utilizes the absorption ratio of light to determine the concentration level of the target particles.
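A Beer-Lambert calculation of the kind referenced above can be sketched as follows; the symbols are the textbook ones (absorption cross-section σ, path length L) rather than names taken from the source, and the numeric values are illustrative assumptions.

```python
import math

# Beer-Lambert law: I = I0 * exp(-sigma * n * L). Given the incident and
# transmitted intensities, the average number density n of the absorbing
# target particles along the path can be recovered.

def concentration_from_absorption(i_out, i_in, sigma_cm2, path_cm):
    """Number density (particles/cm^3) from the measured attenuation."""
    return math.log(i_in / i_out) / (sigma_cm2 * path_cm)

# Example (assumed values): ~10% attenuation over a 10 cm path with an
# absorption cross-section of 0.9e-20 cm^2.
n = concentration_from_absorption(0.9, 1.0, 0.9e-20, 10.0)
```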
[0052] In another embodiment, the detection apparatus may be used to detect target particles generated by a remote object and therefore does not have a chamber structure. For example, an alcohol detection apparatus can be installed in a car to detect whether the driver's breath contains alcohol for safety. In the past, courts in some countries have required certain drivers, based on their criminal records, to use automatic car ignition interlock devices integrated with breathalyzers. Some countries have now legislated to require new cars to install such automatic ignition interlock devices to detect whether the driver's breath contains alcohol and to perform driver identification, enhancing public safety.
[0055] The particle detector 620 can be implemented as an optical sensor device and can refer to the aforementioned optical sensor device 120. The particle detector 620 is configured to provide second image data and may include an optical detector 621 and a light emitter 622. The light emitter 622 may include a steady-state, low-power light source (e.g., around 1 mW), which can be one of the light emitters shown in
[0056] The processor 630 may include a 3D image calculating module 631, a particle map calculating module 632, and a concentration calculating module 633. The 3D image calculating module 631 is coupled to the image sensor 610 to receive first image data from the image sensor 610. The first image data may include depth map data and a 2D image, which is a non-depth image. The 3D image calculating module 631 is configured to perform 3D image calculation and generate a 3D image from the first image data for object recognition. The particle map calculating module 632 is coupled to the 3D image calculating module 631 to receive the 2D image and to the particle detector 620 to receive the second image data. The second image data may include the object morphology and can also show the distribution of the target particles, whereas the 2D image from the image sensor 610 contains only object morphology. Therefore, the particle map calculating module 632 can obtain the particle map data, which includes the particle distribution information, from a comparison between the 2D image and the second image data. The particle map data carries only information about the optical absorption of the target particles along the line of sight of each pixel. The natural logarithm of the optical absorption of the target particles translates to the column density, which is the product of path length and concentration. Therefore, the processor 630 can obtain the concentration result by obtaining the path length and performing mathematical calculations on the particle map data.
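The pixel-wise comparison between the second image data and the 2D image can be sketched as a simple ratio, per claim 1. The grids, values, and function name are illustrative assumptions; real image data would be 2D arrays of received optical power.

```python
# Sketch of the particle map computation: the per-pixel ratio of the
# second image data (absorbed wavelength) to the 2D image (reference
# wavelength). Absorption by target particles lowers the ratio.

def particle_map(second_image, image_2d):
    """Element-wise ratio P(i, j) of two same-sized 2D grids."""
    return [[b / a for b, a in zip(row_b, row_a)]
            for row_b, row_a in zip(second_image, image_2d)]

# Illustrative 2x2 grids (assumed values, not from the source):
second_image = [[0.45, 0.50], [0.50, 0.40]]   # wavelength absorbed by particles
image_2d     = [[0.90, 1.00], [1.00, 1.00]]   # reference, morphology only
P = particle_map(second_image, image_2d)
```

Because object morphology appears in both images, it divides out, leaving only the absorption signature of the target particles in P.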
[0057] The concentration calculating module 633 is coupled to the 3D image calculating module 631 to receive the depth map data and to the particle map calculating module 632 to receive the particle map data. The depth map data from the image sensor 610 contains the path length information. The concentration calculating module 633 can then determine the concentration result according to the depth map data and the particle map data.
[0058] In an implementation, the image sensor 610 is configured to operate at a first wavelength λ_A, and the optical power of the light emitter 612 of the image sensor is P_out(λ_A). The particle detector 620 is configured to operate at a second wavelength λ_B, and the optical power of the light emitter 622 of the particle detector 620 is P_out(λ_B). The reflectance of the object (e.g., the driver's face) for the wavelength λ_A is R_λA, and the reflectance for the wavelength λ_B is R_λB.
[0059] At each pixel, the 2D image from the image sensor 610 is proportional to P_out(λ_A)·R_λA, and the second image data from the particle detector 620 is proportional to P_out(λ_B)·R_λB, attenuated by any target particles along the optical path.
[0060] For simplicity, all angle-of-incidence and emission-profile related effects are lumped into the reflectances R_λA and R_λB.
[0061] In an implementation, the wavelength λ_A can be 1310 nm, which is outside the absorption band of ethanol, and the wavelength λ_B can be 1393 nm, which is within the absorption band of ethanol. The wavelengths λ_A and λ_B are close enough to have similar reflection coefficients, R_λA ≈ R_λB.
[0062] The quantity P_BA = P_out(λ_B)/P_out(λ_A) can be a system calibration parameter and can be obtained from the ratio of the image sensor's and particle detector's light output powers P_out. In an implementation, P_BA can be obtained when the detection apparatus starts to operate and stored in a memory, or automatically calibrated periodically to reduce the drift caused by temperature and aging.
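A minimal sketch of obtaining and refreshing the calibration parameter P_BA follows. The `Calibration` class, the power values, and the refresh mechanism are illustrative assumptions; only the power-ratio definition and the start-up/periodic-recalibration idea come from the text.

```python
# P_BA = P_out(lambda_B) / P_out(lambda_A), stored at start-up and
# refreshed periodically to counter temperature and aging drift.

class Calibration:
    def __init__(self, p_out_b_mw, p_out_a_mw):
        self.p_ba = p_out_b_mw / p_out_a_mw  # stored once at start-up

    def refresh(self, p_out_b_mw, p_out_a_mw):
        # Periodic recalibration overwrites the stored ratio so later
        # concentration math uses the emitters' current output powers.
        self.p_ba = p_out_b_mw / p_out_a_mw

cal = Calibration(1.0, 2.0)   # assumed emitter powers in mW
cal.refresh(0.98, 2.0)        # emitter 622 has drifted slightly
```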
[0063] When the target particles are present, the particle map data P(i, j) can be derived by

P(i, j) = P_BA·exp(−2·α(λ_B)·n_i,j·d(x_i, y_j)),

where the factor 2 accounts for the round trip of the light from the emitter to the object and back.
[0064] Here n_i,j is the average concentration of the target particles (e.g., ethanol) in the air along the path length of each pixel, and α(λ_B) is the absorption cross-section of the target particles at the wavelength λ_B. For example, for ethanol vapor dispersed in the air, α(λ_B) ≈ 0.9×10⁻²⁰ cm². Hence, the average concentration n_i,j of each pixel can be derived by

n_i,j = ln(P_BA/P(i, j))/(2·α(λ_B)·d(x_i, y_j)).
[0065] Therefore, the concentration calculating module 633 can obtain the average concentration n_i,j of each pixel through the particle map data P(i, j) and the depth map data d(x_i, y_j). In one embodiment, the concentration calculating module 633 can output the average concentration as the concentration result, determined by averaging n_i,j over all pixels or by selecting the maximum value of n_i,j. The average concentration may be lower than the peak concentration. The automatic ignition device can deactivate the ignition when the average concentration of alcohol detected by an alcohol detection apparatus installed in the car exceeds a threshold.
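The per-pixel concentration derivation can be sketched as follows, assuming the round-trip model P(i, j) = P_BA·exp(−2·α(λ_B)·n_i,j·d_i,j); the factor 2 for the out-and-back optical path, the function names, and the unit choices are assumptions of this sketch, not statements of the patented implementation.

```python
import math

# Per-pixel average concentration and the concentration result.
# Depths are in cm so they pair with alpha_B in cm^2, giving n in cm^-3.

def pixel_concentration(p_ij, p_ba, alpha_b_cm2, d_cm):
    """Average number density (cm^-3) along the line of sight of one pixel."""
    return math.log(p_ba / p_ij) / (2.0 * alpha_b_cm2 * d_cm)

def concentration_result(P, p_ba, alpha_b_cm2, depth_cm, mode="average"):
    # Either average n_ij over all pixels or take the maximum, mirroring
    # the two options described in the text.
    n = [pixel_concentration(P[i][j], p_ba, alpha_b_cm2, depth_cm[i][j])
         for i in range(len(P)) for j in range(len(P[0]))]
    return sum(n) / len(n) if mode == "average" else max(n)
```

Usage: with P_BA = 0.5, α(λ_B) = 0.9×10⁻²⁰ cm², and a 10 cm path, a pixel value of 0.5·exp(−0.18) corresponds to n ≈ 10¹⁸ cm⁻³.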
[0066] In another embodiment, the concentration calculating module 633 can output the total amount of the target particles as the concentration result, such as the total amount of alcohol exhaled in the driver's single breath. Δx_i·Δy_j is the unit projection area corresponding to each (i, j) pixel on the object, and the distance from each (i, j) pixel to the unit projection area Δx_i·Δy_j is d(x_i, y_j). A volume element corresponding to each (i, j) pixel of the object can then be represented by a physical pyramid dV_i,j.
[0067] The following is the calculation formula for the total amount m of the target particles, obtained by integrating the average concentration n_i,j over all the pyramid volume elements dV_i,j, where μ is the molecular weight of the target particle and the unit projected area Δx_i·Δy_j is represented by S_i,j:

m = μ·Σ_i,j n_i,j·dV_i,j = μ·Σ_i,j n_i,j·(1/3)·S_i,j·d(x_i, y_j).
[0068] The total amount m of the target particles can be regarded as a weighted sum over the pixel values of the particle map data P(i, j), with the weighting coefficient given by the corresponding unit projected area S_i,j. When the target particles are not present, P(i, j) = P_BA, and the total amount m is calculated to be zero. In an implementation, when the target particle is alcohol, the ethanol molecular weight is μ ≈ 7.65×10⁻²⁶ kg.
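The total-amount summation can be sketched in code, under the same assumptions used above: pyramid volume elements dV = S·d/3 and the round-trip factor 2 in the per-pixel concentration. Names, units (cm and kg), and grid values are illustrative.

```python
import math

# Total amount m = mu * sum_ij n_ij * dV_ij, with pyramid volume
# elements dV_ij = S_ij * d_ij / 3 (areas in cm^2, depths in cm).

def total_amount(P, p_ba, alpha_b_cm2, depth_cm, areas_cm2, mu_kg):
    m = 0.0
    for i in range(len(P)):
        for j in range(len(P[0])):
            # Per-pixel average number density (cm^-3), round-trip model.
            n_ij = math.log(p_ba / P[i][j]) / (2.0 * alpha_b_cm2 * depth_cm[i][j])
            dV = areas_cm2[i][j] * depth_cm[i][j] / 3.0  # pyramid volume (cm^3)
            m += mu_kg * n_ij * dV
    return m  # kg

# When no target particles are present, P(i, j) == P_BA and m == 0.
```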
[0069] The above summation for the total amount m can also be expressed in terms of θ_x and θ_y, the per-pixel angular resolutions of the image sensor 610. Assuming θ_x and θ_y are both small, S_i,j ≈ θ_x·θ_y·d(x_i, y_j)², and the calculation formula for the total amount m is as follows:

m = (μ·θ_x·θ_y/(6·α(λ_B)))·Σ_i,j d(x_i, y_j)²·ln(P_BA/P(i, j)).
[0070] The total amount m can again be regarded as a weighted sum over the pixel values of the particle map data P(i, j), where the weighting coefficient is related to the square of the depth map data d(x_i, y_j) obtained from the image sensor 610 operation.
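The angular-resolution form can be sketched directly, assuming the reconstruction m = μ·θ_x·θ_y/(6·α(λ_B))·Σ d²·ln(P_BA/P); the pyramid factor 1/3 and the round-trip factor 2 folded into the constant 6 are assumptions of this sketch.

```python
import math

# Angular-resolution form: with small per-pixel angles theta_x, theta_y,
# the unit projected area at depth d is approximately theta_x*theta_y*d^2,
# so each pixel's weight scales with the squared depth map value.

def total_amount_angular(P, p_ba, alpha_b_cm2, depth_cm, theta_x, theta_y, mu_kg):
    s = sum(depth_cm[i][j] ** 2 * math.log(p_ba / P[i][j])
            for i in range(len(P)) for j in range(len(P[0])))
    return mu_kg * theta_x * theta_y / (6.0 * alpha_b_cm2) * s  # kg
```

For a uniform scene this agrees with the pyramid-volume summation, since S_i,j reduces to θ_x·θ_y·d², and it still yields zero when P(i, j) = P_BA everywhere.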
[0071] Therefore, the detection apparatus 600 can perform object recognition through the 3D image and simultaneously (during at least partially overlapping time periods) obtain the presence or concentration of the target particles through the concentration result. The detection apparatus 600 may require tens of milliseconds to obtain a few frames, including the depth image, to determine the 3D image and obtain particle concentration information at the same time.
[0072] In another implementation, the particle map calculating module 632 of the processor 630 can couple to the image sensor 610 and obtain the 2D image from the first image data. The concentration calculating module 633 of the processor can couple to the image sensor 610 and obtain the depth map data from the first image data.
[0076] Various means can be configured to perform the methods, operations, and processes described herein. For example, any of the systems and apparatuses (e.g., optical sensing devices and related circuitry) can include unit(s) and/or other means for performing their operations and functions described herein. In some implementations, one or more of the units may be implemented separately. In some implementations, one or more units may be a part of or included in one or more other units. These means can include processor(s), microprocessor(s), graphics processing unit(s), logic circuit(s), dedicated circuit(s), application-specific integrated circuit(s), programmable array logic, field-programmable gate array(s), controller(s), microcontroller(s), and/or other suitable hardware. The means can also, or alternately, include software control means implemented with a processor or logic circuitry, for example. The means can include or otherwise be able to access memory such as, for example, one or more non-transitory computer-readable storage media, such as random-access memory, read-only memory, electrically erasable programmable read-only memory, erasable programmable read-only memory, flash/other memory device(s), data register(s), database(s), and/or other suitable hardware.
[0077] Although terms such as first, second, third, etc. are used herein to describe various elements, components, regions, layers, and/or sections, these elements, components, regions, layers, and/or sections should not be limited by those terms. The terms may be used only to distinguish one element, component, region, layer, section, signal, or operation from another, and do not imply a sequence or order unless clearly indicated by the context. The terms light-receiving, light-detecting, light-sensing, and any other similar terms can be used interchangeably.
[0078] Aspects of the disclosure have been described in terms of illustrative embodiments thereof. Numerous other embodiments, modifications, and/or variations within the scope and spirit of the appended claims can occur to persons of ordinary skill in the art from a review of this disclosure. Any and all features in the following claims can be combined and/or rearranged in any way possible. Accordingly, the scope of the present disclosure is by way of example rather than by way of limitation, and the subject disclosure does not preclude inclusion of such modifications, variations or additions to the present subject matter as would be readily apparent to one of ordinary skill in the art. Moreover, terms are described herein using lists of example elements joined by conjunctions such as and, or, but, etc. It should be understood that such conjunctions are provided for explanatory purposes only. Lists joined by a particular conjunction such as or, for example, can refer to at least one of or any combination of example elements listed therein. Also, terms such as based on should be understood as based at least in part on.
[0079] Those of ordinary skill in the art, using the disclosures provided herein, will understand that the elements of any of the claims discussed herein can be adapted, rearranged, expanded, omitted, combined, or modified in various ways without deviating from the scope of the present disclosure.
[0080] While the disclosure has been described by way of example and in terms of a preferred embodiment, it is to be understood that the disclosure is not limited thereto. On the contrary, it is intended to cover various modifications and similar arrangements and procedures, and the scope of the appended claims therefore should be accorded the broadest interpretation so as to encompass all such modifications and similar arrangements and procedures.