ULTRAFAST 3D IMAGING TECHNIQUE EMPLOYING EVENT-DRIVEN CAMERAS
20220252731 · 2022-08-11
Inventors
- Wen Li (Troy, MI, US)
- Suk Kyoung Lee (Troy, MI, US)
- Gabriel Stewart (Grosse Pointe Park, MI, US)
- Duke Debrah (Detroit, MI, US)
CPC classification
G01T5/00
PHYSICS
G01S17/894
PHYSICS
G01S7/4915
PHYSICS
International classification
G01S17/894
PHYSICS
G01S7/4865
PHYSICS
G01S7/4915
PHYSICS
Abstract
A source emits a pulse to an object to generate a particle, and an imaging detector produces a light flash at an X/Y hit position of the particle. The detector outputs a waveform arising from the particle. An event-driven camera provides a signal from the detector that includes intensity and time-over-threshold signals related to the light flash, time-of-arrival information of the event, and the X and Y hit position of the particle. A photodiode determines a time origin of the pulse from the source. A timing circuit is coupled to the detector and to the photodiode, and determines time-of-flight (TOF) of the particle based on the waveform, and based on the time origin of the pulse. The 3D coordinates are generated based on the X/Y hit position synchronized with the TOF of the particle.
Claims
1. A system, comprising: a source configured to emit a pulse of emissions to an object, to generate a particle in the object; an imaging detector positioned to receive the particle, and configured to produce a light flash as an event that is indicative of an X and Y hit position of the particle in the imaging detector, and configured to output a waveform arising from the particle; an event-driven camera directed toward the imaging detector and capable of providing an imaging detector signal that includes, when the event occurs: 1) intensity and time-over-threshold (TOT) signals related to the light flash; 2) time-of-arrival (TOA) information of the event; and 3) the X and Y hit position of the particle based on a location of the light flash; a photodiode positioned to detect signals indicative of a time origin of the pulse from the source; a timing circuit coupled to the imaging detector and coupled to the photodiode, and configured to determine a time-of-flight (TOF) of the particle based on the waveform from the imaging detector, and based on the time origin of the pulse from the photodiode; and a hardware processor and a memory having a program communicatively connected to the hardware processor, the hardware processor being communicatively connected to the timing circuit and to the event-driven camera, the hardware processor providing operations including: generating 3D coordinates (position and time) for the particle based on the X and Y hit position of the particle synchronized with the TOF of the particle.
2. The system of claim 1, wherein the imaging detector is one of a micro-channel plate (MCP)/phosphor imaging detector and an image intensifier.
3. The system according to claim 1, wherein the hardware processing operations further include generating the 3D coordinates by synchronizing 1) a first global time stamp that corresponds with the detected signals from the photodiode with 2) a second global time stamp that corresponds with the TOA from the event-driven camera.
4. The system according to claim 1, wherein the timing circuit includes a digitizer coupled to the hardware processor, the digitizer configured to digitize the signals from the photodiode, and to digitize the TOT signals from the event-driven camera, and the hardware processor determines the TOF based on the digitized signals from the photodiode and based on the digitized TOT signals.
5. The system according to claim 1, wherein the timing circuit includes a time-to-digital converter (TDC) coupled to the hardware processor, and the TDC determines a number of counts that correspond with the TOF based on the time origin of the pulse from the photodiode and based on a time of a peak of the imaging detector signal.
6. The system of claim 1, wherein the source is a laser.
7. The system of claim 1, wherein the particle is one of an ion, an electron, and a photon.
8. The system of claim 1, wherein the timing circuit is triggered by the imaging detector signal.
9. The system of claim 1, further comprising one or more electrodes having openings through which the particle is accelerated from the object to the imaging detector.
10. A method, comprising: configuring a source to emit a pulse of emissions to an object, generating a particle in the object; positioning an imaging detector to receive the particle, the imaging detector configured to produce a light flash as an event that is indicative of an X and Y hit position of the particle in the imaging detector, and to output a waveform arising from the particle; directing an event-driven camera toward the imaging detector and capable of providing an imaging detector signal that includes, when the event occurs: 1) intensity and time-over-threshold (TOT) signals related to the light flash; 2) time-of-arrival (TOA) information of the event; and 3) the X and Y hit position of the particle based on a location of the light flash; positioning a photodiode to detect signals indicative of a time origin of the pulse from the source; coupling a timing circuit to the imaging detector and to the photodiode, and configuring the timing circuit to determine a time-of-flight (TOF) of the particle based on the waveform from the imaging detector, and based on the time origin of the pulse from the photodiode; and communicatively connecting a hardware processor and a memory having a program to the hardware processor, and communicatively connecting the hardware processor to the timing circuit and to the event-driven camera, and providing the hardware processor operations including: generating 3D coordinates (position and time) for the particle based on the X and Y hit position of the particle synchronized with the TOF of the particle.
11. The method according to claim 10, wherein the imaging detector is one of a micro-channel plate (MCP)/phosphor imaging detector and an image intensifier.
12. The method according to claim 10, further providing the hardware processing operations including generating the 3D coordinates by synchronizing 1) a first global time stamp that corresponds with the detected signals from the photodiode with 2) a second global time stamp that corresponds with the TOA from the event-driven camera.
13. The method according to claim 10, wherein coupling the timing circuit includes coupling a digitizer to the hardware processor, and configuring the digitizer to digitize the signals from the photodiode, and to digitize the TOT signals from the event-driven camera, and determining the TOF based on the digitized signals from the photodiode and based on the digitized TOT signals.
14. The method according to claim 10, wherein coupling the timing circuit includes coupling a time-to-digital converter (TDC) to the hardware processor, and the TDC determines a number of counts that correspond with the TOF based on the time origin of the pulse from the photodiode and based on a time of a peak of the imaging detector signal.
15. The method of claim 10, wherein configuring the source includes configuring a laser to emit the pulse.
16. The method of claim 10, wherein generating the particle includes generating one of an ion, an electron, and a photon.
17. The method of claim 10, further comprising triggering the timing circuit by the imaging detector signal.
18. The method of claim 10, further comprising positioning one or more electrodes having openings through which the particle is accelerated from the object to the imaging detector.
19. A method, comprising: generating 3D coordinates for a particle based on an X and Y hit position of the particle synchronized with a time-of-flight (TOF) of the particle, the particle generated from a pulse of emissions directed at an object; determining the X and Y hit position from an event-driven camera; and determining the TOF of the particle based on 1) a waveform from an imaging detector that receives the particle, and 2) a time origin of the particle, wherein the time origin of the particle is determined in a photodiode that receives the pulse of emissions.
20. The method of claim 19, wherein the imaging detector is one of a micro-channel plate (MCP)/phosphor imaging detector and an image intensifier.
21. The method of claim 19, wherein the pulse of emissions is from a laser.
22. The method of claim 19, wherein the particle generated is one of an ion, an electron, and a photon.
23. The method of claim 19, further comprising positioning one or more electrodes having openings through which the particle is accelerated from the object to the imaging detector.
Description
BRIEF DESCRIPTION OF THE DRAWINGS
DETAILED DESCRIPTION
[0023] According to the disclosure, an event-driven camera is demonstrated as a drop-in replacement for a conventional CMOS camera. In one example, the camera-based 3D imaging system achieves an event rate approaching 1 million electron hits per second (Mhits/s) while maintaining outstanding TOF resolution and very low deadtime for electron detection.
[0024] Disclosed Setup and Methods
[0025] According to the disclosure, an experimental setup is disclosed in
[0026] To demonstrate the disclosed arrangement, the system measured photo-induced thermionic emission from graphene using a high-repetition-rate laser system, similar to previously reported work. The employed laser was a mode-locked Ti:Sapphire oscillator system (repetition rate of 80 MHz). The center wavelength was 790 nm and the pulse duration was ˜35 fs. The laser input power was a few tens of mW. Commercially available chemical vapor deposition (CVD) graphene on a fused silica surface (graphenesquare.com) was used without further modification or treatment. The sample was placed in a high-vacuum chamber (˜10⁻⁹ torr) at room temperature and was directly mounted onto the first electrode of the spectrometer. The laser power was varied to yield different event rates (100 Khits/s, 200 Khits/s, and 500 Khits/s) as read by the digitizer. The electrons emerging from graphene were accelerated and momentum-focused toward the MCP/P47 phosphor detector (Photonis APD, 75 mm diameter) by a four-electrode VMI spectrometer. Upon electron impact, light flashes were produced on the phosphor screen indicating the hit positions. The positions were then captured by the Tpx3Cam camera, and the TOF was obtained by digitizing the electrical signals associated with the voltage drop in the MCP produced by electron hits. The camera was operated in free-run mode, but the high-speed digitizer was triggered by MCP signals. The signal from the MCP was first combined with the laser signal picked off from a photodiode.
[0027] Because the digitizer and the camera cannot be triggered by the laser pulses directly, due to the extremely high laser repetition rate, the positional information read from the camera and the TOF from the digitizer must be synchronized to provide 3D information (2D position plus TOF) for each event. This was achieved offline by matching the global timestamps of the digitizer events with the TOAs of Tpx3Cam events, both of which were available from the metadata associated with each event. Note that the TOA is not the TOF of the electron hit but a global timestamp registering when a camera event takes place. The TOA counter has sufficient depth to run for several hours of data acquisition, providing global timestamps with a granularity of 1.6 ns, while the digitizer timestamps can be as accurate as 1 ns.
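The offline matching described in this paragraph can be sketched as follows. This is an illustrative sketch only: the function name, the event representation, and the matching tolerance are assumptions made for the example, not part of the disclosure; only the 1.6 ns TOA granularity comes from the description above.

```python
import bisect

def match_events(digitizer_ts_ns, camera_toa_ticks, tick_ns=1.6, tol_ns=25.0):
    """Pair digitizer events with the nearest camera events by global timestamp.

    digitizer_ts_ns  -- sorted digitizer global timestamps (ns)
    camera_toa_ticks -- sorted camera TOA counter values
    tick_ns          -- TOA granularity (1.6 ns per the description above)
    tol_ns           -- matching window; illustrative value
    Returns (digitizer_index, camera_index) pairs for matched events.
    """
    camera_ns = [t * tick_ns for t in camera_toa_ticks]
    pairs = []
    for i, ts in enumerate(digitizer_ts_ns):
        j = bisect.bisect_left(camera_ns, ts)
        # the nearest camera event is either just before or just after ts
        best = None
        for k in (j - 1, j):
            if 0 <= k < len(camera_ns):
                d = abs(camera_ns[k] - ts)
                if best is None or d < best[0]:
                    best = (d, k)
        if best is not None and best[0] <= tol_ns:
            pairs.append((i, best[1]))
    return pairs
```

In practice each matched pair would carry the camera's X and Y hit position together with the digitizer's TOF, yielding the 3D (position plus time) record for the event.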
[0028] The electron TOF was obtained using a peak detection algorithm on recorded digitizer traces, one of which is shown in
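A minimal version of such a peak-detection step might look like the following; the sampling interval, threshold, and time-origin handling are assumptions for illustration, and a production version would typically interpolate around each peak for sub-sample accuracy.

```python
def electron_tof(trace, dt_ns, t0_ns, threshold):
    """Extract electron TOF values from one recorded digitizer trace.

    trace     -- sampled MCP signal amplitudes, one value per sample
    dt_ns     -- digitizer sampling interval (ns)
    t0_ns     -- laser time origin from the photodiode signal (ns)
    threshold -- minimum amplitude for a local maximum to count as a hit
    Returns a list of TOF values (ns), one per detected peak.
    """
    tofs = []
    for i in range(1, len(trace) - 1):
        # a local maximum above threshold marks an electron hit
        if trace[i] >= threshold and trace[i] > trace[i - 1] and trace[i] >= trace[i + 1]:
            tofs.append(i * dt_ns - t0_ns)
    return tofs
```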
[0029] According to the disclosure and referring again to
[0033] A photodiode 120 is positioned to detect signals from emissions 104 that include a pulse 122 that passes unimpeded from source 102 to photodiode 120 (i.e., pulse 122 does not pass through any materials, such as electrodes 128), the signals being indicative of a time origin 202 of pulse 122 from source 102. A timing circuit 124 is coupled to MCP/phosphor imaging detector 112 and coupled to photodiode 120 via a signal decoupler 126, and is configured to determine a time-of-flight (TOF) 204 of particle 108 based on waveform 200 from MCP/phosphor imaging detector 112 and based on time origin 202 of pulse 122 determined from photodiode 120. A hardware processor 130 includes a memory having a program communicatively connected to hardware processor 130, hardware processor 130 being communicatively connected to timing circuit 124 and to event-driven camera 116, the hardware processor providing operations that include generating 3D coordinates (position and time) for particle 108 based on X and Y hit position 118 of particle 108 synchronized with TOF 204 of particle 108.
[0034] Hardware processing operations further include generating the 3D coordinates by synchronizing 1) a first global time stamp that corresponds with the detected signals from photodiode 120 with 2) a second global time stamp that corresponds with TOA 202 from event-driven camera 116. Timing circuit 124 includes, in one example, a digitizer 132 coupled to hardware processor 130. Digitizer 132 is configured to digitize the signals from photodiode 120 and to digitize the TOT signals from event-driven camera 116. Hardware processor 130 determines TOF 204 based on the digitized signals from photodiode 120 and based on the digitized TOT signals.
[0035] In another example, timing circuit 124 includes, instead of digitizer 132, a time-to-digital converter (TDC) 134 coupled to the hardware processor, and TDC 134 determines a number of counts that correspond with TOF 204 based on time origin 202 of pulse 122 determined from photodiode 120 and based on a time of a peak of MCP signal 119/200.
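The TDC variant reduces to converting a start/stop count difference into a time interval. The sketch below is illustrative only; the bin width and the counter-rollover handling are assumptions, since the disclosure does not specify a particular TDC device.

```python
def tdc_tof(start_count, stop_count, bin_ns, rollover=None):
    """TOF from TDC counts.

    start_count -- count latched at the photodiode time origin
    stop_count  -- count latched at the peak of the MCP signal
    bin_ns      -- TDC bin width (ns); illustrative, device-specific
    rollover    -- counter modulus, if the free-running counter wraps
    """
    n = stop_count - start_count
    if rollover is not None and n < 0:
        n += rollover  # correct for counter wrap-around between start and stop
    return n * bin_ns
```

For example, with a 25 ps bin, a count difference of 50 corresponds to a 1.25 ns TOF.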
[0036] In one example, source 102 is a laser. In one example, the particle is one of an ion, an electron, and a photon, generated in object 106. In one example, timing circuit 124 is triggered by MCP signal 119/200. In one example, system 100 further includes one or more electrodes 128 having openings 136 through which particle 108 is accelerated from object 106 to MCP/phosphor imaging detector 112. System 100 may include a signal decoupler 138 and an amplifier 140.
[0037] According to the disclosure, referring to
[0039] Results and Discussion
[0040] The 3D measurement results are shown in
[0041] To summarize, the disclosed camera-based 3D imaging system is demonstrated using the Tpx3Cam, with its great multi-hit capability and time resolution, and is capable of achieving 1 Mhits/s. It is noted that commercial event-driven cameras with a timing accuracy of one microsecond would make it possible to achieve similar 1 Mhits/s performance at a potentially lower cost.
[0043] Computers 130 and 142 may include one or more of devices 702a, 702b, server 705, processor 706, memory 708, program 710, transceiver 712, user interface 714, network 720, and database 722. Device 702 may include any or all of devices 702a, 702b (e.g., a desktop, laptop, or tablet computer). Processor 706 may include a hardware processor that executes program 710 to provide any or all of the operations described herein.
[0044] Connections may be any wired or wireless connections between two or more endpoints (e.g., devices or systems), for example, to facilitate transfer of information. The connection may include a local area network, for example, to communicatively connect the devices 702a and 702b with network 720. The connection may include a wide area network connection, for example, to communicatively connect server 705 with network 720. The connection may include a wireless connection, e.g., radiofrequency (RF), near field communication (NFC), Bluetooth communication, Wi-Fi, or a wired connection, for example, to communicatively connect the devices 702a and 702b.
[0045] Any portion of the system may include a computing system and/or device that includes a processor 706 and a memory 708. Computing systems and/or devices generally include computer-executable instructions, where the instructions may define operations and may be executable by one or more devices such as those listed herein. Computer-executable instructions may be compiled or interpreted from computer programs created using a variety of programming languages and/or technologies, including, without limitation, and either alone or in combination, the Java language, C, C++, Visual Basic, JavaScript, Perl, SQL, PL/SQL, shell scripts, the Unity language, etc. The system may take many different forms and include multiple and/or alternate components and facilities, as illustrated in the Figures. While exemplary systems, devices, modules, and sub-modules are shown in the Figures, the exemplary components illustrated in the Figures are not intended to be limiting. Indeed, additional or alternative components and/or implementations may be used, and thus the above communication operation examples should not be construed as limiting.
[0046] In general, the computing systems and/or devices may employ any of a number of computer operating systems, including, but by no means limited to, versions and/or varieties of the Microsoft Windows® operating system, the Unix operating system (e.g., the Solaris® operating system distributed by Oracle Corporation of Redwood Shores, Calif.), the AIX UNIX operating system distributed by International Business Machines of Armonk, N.Y., the Linux operating system, the Mac OS X and iOS operating systems distributed by Apple Inc. of Cupertino, Calif., the BlackBerry OS distributed by Research In Motion of Waterloo, Canada, and the Android operating system developed by the Open Handset Alliance. Examples of computing systems and/or devices may include, without limitation, mobile devices, cellular phones, smart-phones, super-phones, next generation portable devices, mobile printers, handheld or desktop computers, notebooks, laptops, tablets, wearables, virtual or augmented reality devices, secure voice communication equipment, networking hardware, computer workstations, or any other computing system and/or device.
[0047] Further, processors such as processor 706 receive instructions from memories such as memory 708 or database 722 and execute the instructions to provide the operations herein, thereby performing one or more processes, including one or more of the processes described herein. Such instructions and other information may be stored and transmitted using a variety of computer-readable mediums (e.g., memory 708 or database 722). Processors such as processor 706 may include any computer hardware or combination of computer hardware that is configured to accomplish the purpose of the devices, systems, operations, and processes described herein. For example, the processor 706 may be any one of, but not limited to, single, dual, triple, or quad core processors (on a single chip), graphics processing units, and visual processing hardware.
[0048] A memory such as memory 708 or database 722 may include, in general, any computer-readable medium (also referred to as a processor-readable medium) that may include any non-transitory (e.g., tangible) medium that provides instructions that may be read by a computer. Such a medium may take many forms, including, but not limited to, non-volatile media and volatile media. Non-volatile media may include, for example, optical or magnetic disks and other persistent memory. Volatile media may include, for example, dynamic random access memory (DRAM), which typically constitutes a main memory. Such instructions may be transmitted by one or more transmission media, including radio waves, metal wire, fiber optics, and the like, including the wires that comprise a system bus coupled to a processor of a computer. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM, any other memory chip or cartridge, or any other medium from which a computer can read.
[0049] Further, databases, data repositories or other information stores (e.g., memory 708 and database 722) described herein may generally include various kinds of mechanisms for storing, providing, accessing, and retrieving various kinds of information, including a hierarchical database, a set of files in a file system, an application database in a proprietary format, a relational database management system (RDBMS), etc. Each such information store may generally be included within a computing system and/or device employing a computer operating system such as one of those mentioned above, and/or accessed via a network or connection in any one or more of a variety of manners. A file system may be accessible from a computer operating system, and may include files stored in various formats. An RDBMS generally employs the Structured Query Language (SQL) in addition to a language for creating, storing, editing, and executing stored procedures, such as the PL/SQL language mentioned above.
[0053] Thus, according to the disclosure, a system includes a source configured to emit a pulse of emissions to an object, to generate a particle in the object, and an imaging detector positioned to receive the particle, and configured to produce a light flash as an event that is indicative of an X and Y hit position of the particle in the imaging detector, and configured to output a waveform arising from the particle. An event-driven camera is directed toward the imaging detector and capable of providing an imaging detector signal that includes, when the event occurs: intensity and time-over-threshold (TOT) signals related to the light flash, time-of-arrival (TOA) information of the event, and the X and Y hit position of the particle based on a location of the light flash. A photodiode is positioned to detect signals indicative of a time origin of the pulse from the source. A timing circuit is coupled to the imaging detector and coupled to the photodiode, and configured to determine a time-of-flight (TOF) of the particle based on the waveform from the imaging detector, and based on the time origin of the pulse from the photodiode. A hardware processor and a memory having a program communicatively connected to the hardware processor, the hardware processor being communicatively connected to the timing circuit and to the event-driven camera, the hardware processor providing operations including generating 3D coordinates (position and time) for the particle based on the X and Y hit position of the particle synchronized with the TOF of the particle.
[0054] According to the disclosure, a method includes configuring a source to emit a pulse of emissions to an object, generating a particle in the object and positioning an imaging detector to receive the particle, the imaging detector configured to produce a light flash as an event that is indicative of an X and Y hit position of the particle in the imaging detector, and to output a waveform arising from the particle. The method includes directing an event-driven camera toward the imaging detector and capable of providing an imaging detector signal that includes, when the event occurs: intensity and time-over-threshold (TOT) signals related to the light flash, time-of-arrival (TOA) information of the event, and the X and Y hit position of the particle based on a location of the light flash. The method further includes positioning a photodiode to detect signals indicative of a time origin of the pulse from the source, coupling a timing circuit to the imaging detector and to the photodiode, and configuring the timing circuit to determine a time-of-flight (TOF) of the particle based on the waveform from the imaging detector, and based on the time origin of the pulse from the photodiode, and communicatively connecting a hardware processor and a memory having a program to the hardware processor, and communicatively connecting the hardware processor to the timing circuit and to the event-driven camera, and providing the hardware processor operations including generating 3D coordinates (position and time) for the particle based on the X and Y hit position of the particle synchronized with the TOF of the particle.
[0055] According to the disclosure, a method includes generating 3D coordinates for a particle based on an X and Y hit position of the particle synchronized with a time-of-flight (TOF) of the particle, the particle generated from a pulse of emissions directed at an object, determining the X and Y hit position from an event-driven camera, and determining the TOF of the particle based on 1) a waveform from an imaging detector that receives the particle, and 2) a time origin of the particle, wherein the time origin of the particle is determined in a photodiode that receives the pulse of emissions.
[0056] With regard to the processes, systems, methods, heuristics, etc. described herein, it should be understood that, although the steps of such processes, etc. have been described as occurring according to a certain ordered sequence, such processes could be practiced with the described steps performed in an order other than the order described herein. It further should be understood that certain steps could be performed simultaneously, that other steps could be added, or that certain steps described herein could be omitted. In other words, the descriptions of processes herein are provided for the purpose of illustrating certain examples, and should in no way be construed so as to limit the claims.
[0057] Accordingly, it is to be understood that the above description is intended to be illustrative and not restrictive. Many examples and applications other than those provided would be apparent upon reading the above description. The scope should be determined, not with reference to the above description, but should instead be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. It is anticipated and intended that future developments will occur in the technologies discussed herein, and that the disclosed systems and methods will be incorporated into such future examples. In sum, it should be understood that the application is capable of modification and variation.
[0058] All terms used in the claims are intended to be given their broadest reasonable constructions and their ordinary meanings as understood by those knowledgeable in the technologies described herein unless an explicit indication to the contrary is made herein. In particular, use of the singular articles such as “a,” “the,” “said,” etc. should be read to recite one or more of the indicated elements unless a claim recites an explicit limitation to the contrary.
[0059] The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various examples for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed example. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.