NON-INVASIVE DIFFUSE ACOUSTIC CONFOCAL THREE-DIMENSIONAL IMAGING
20190117189 · 2019-04-25
Inventors
CPC classification
A61B8/40 (HUMAN NECESSITIES)
A61B8/5223
A61B8/085
A61B8/4477
A61B8/4281
A61B8/4494
International classification
Abstract
A method and a device for imaging an object in a first material having a different optical density to the object. The method comprises: focusing an acoustic coherent beam to a virtual acoustic source in a fluid or an amorphous second material outside of the first material; moving the virtual acoustic source in the fluid or amorphous material such that at least a plurality of scattered beams from the virtual acoustic source scan the first material and at least one scattered beam is reflected from the object to form a reflected beam and at least one scattered beam bypasses the object to form a bypass beam, and wherein the reflected beam and the bypass beam intercept one another to form a coherent interference zone; and defocusing the coherent interference zone to provide a Fresnel fringe, the Fresnel fringe forming an image of the object.
Claims
1. A method of imaging an object in a first material having a different optical density to the object, the method comprising: focusing an acoustic coherent beam to a virtual acoustic source in a fluid or an amorphous second material outside of the first material; moving the virtual acoustic source in the fluid or amorphous material such that at least a plurality of scattered beams from the virtual acoustic source scan the first material and at least one scattered beam is reflected from the object to form a reflected beam and at least one scattered beam bypasses the object to form a bypass beam, and wherein the reflected beam and the bypass beam intercept one another to form a coherent interference zone; and defocusing the coherent interference zone to provide a Fresnel fringe, the Fresnel fringe forming an image of the object.
2. The method of claim 1, further comprising an acoustic detector detecting the image of the object.
3. The method of claim 2, further comprising moving the virtual acoustic source in the fluid or amorphous material such that at least one scattered beam passes through the object to become an object beam and is detected by the acoustic detector.
4. The method of claim 3, further comprising comparing the phase of the object beam with the phase of the bypass beam to provide information about the object.
5. The method of claim 4, further comprising comparing the amplitude of the object beam with the amplitude of the bypass beam to provide information about the object.
6. The method of claim 5, wherein the information from the phase is the temperature, composition, magnetic field or electrostatic field of the object and the information from the amplitude is the optical density of the object.
7. The method of claim 6, wherein the speed of sound of the object is determined.
8. The method of claim 7, wherein the speed of sound of the object is used to identify the object.
9. The method of claim 1, wherein the object is identified as a tumour or lesion in the first material.
10. The method of claim 9, wherein the acoustic coherent beam is focused in a fluid or amorphous second material in the body of a patient.
11. The method of claim 10, wherein the acoustic coherent beam is focused in one of urine in the bladder, fatty tissue, brain tissue, peritoneum fluid, stomach fluid, the gall bladder and the spleen.
12. The method of claim 9, wherein the acoustic coherent beam is focused in a fluid outside of the body of a patient in which at least the part of the body of interest is immersed.
13. A system for imaging an object in a first material of different optical density to the object, the system comprising: a coherent acoustic beam source which emits a coherent acoustic beam; a focuser positioned to focus the coherent acoustic beam to a virtual acoustic imaging source, the virtual acoustic imaging source emitting scattered beams; a coherent acoustic beam source actuator in mechanical communication with the coherent acoustic beam source; a focuser actuator in mechanical communication with the focuser; a processor in electronic communication with the coherent acoustic beam source actuator and the focuser actuator; a memory in communication with the processor and having instructions thereon to instruct the processor to move the coherent acoustic beam source and the focuser such that at least some of the scattered beams are reflected off the object to become reflected beams and at least some of the scattered beams pass by the object to become bypass beams, and such that the reflected beams and the bypass beams form an interference zone, the memory further having instructions thereon to instruct the processor to move the coherent acoustic beam source and the focuser to produce a Fresnel fringe in the interference zone; and an acoustic detector positioned to image the Fresnel fringe.
14. The system of claim 13, wherein the memory includes instructions for the processor to sharpen the image.
15. The system of claim 14, further comprising a spatial filter in front of the acoustic detector.
16. The system of claim 15, further comprising a cone reflector, the cone reflector between the coherent acoustic beam source and the focuser.
17. The system of claim 16, further comprising a pair of articulating arms, each with a distal end, and a second acoustic detector, each acoustic detector mounted on an arm, proximate the distal end.
18. The system of claim 17, sized for imaging ovaries.
Description
FIGURES
DESCRIPTION
[0044] Except as otherwise expressly provided, the following rules of interpretation apply to this specification (written description and claims): (a) all words used herein shall be construed to be of such gender or number (singular or plural) as the circumstances require; (b) the singular terms a, an, and the, as used in the specification and the appended claims include plural references unless the context clearly dictates otherwise; (c) the antecedent term about applied to a recited range or value denotes an approximation within the deviation in the range or value known or expected in the art from the measurements method; (d) the words herein, hereby, hereof, hereto, hereinbefore, and hereinafter, and words of similar import, refer to this specification in its entirety and not to any particular paragraph, claim or other subdivision, unless otherwise specified; (e) descriptive headings are for convenience only and shall not control or affect the meaning or construction of any part of the specification; and (f) or and any are not exclusive and include and including are not limiting. Further, the terms comprising, having, including, and containing are to be construed as open-ended terms (i.e., meaning including, but not limited to,) unless otherwise noted.
[0045] Recitation of ranges of values herein are merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. Where a specific range of values is provided, it is understood that each intervening value, to the tenth of the unit of the lower limit unless the context clearly dictates otherwise, between the upper and lower limit of that range and any other stated or intervening value in that stated range, is included therein. All smaller sub ranges are also included. The upper and lower limits of these smaller ranges are also included therein, subject to any specifically excluded limit in the stated range.
[0046] Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the relevant art. Although any methods and materials similar or equivalent to those described herein can also be used, the acceptable methods and materials are now described.
[0047] An overview of the system, generally referred to as 8, for imaging a tissue, an organ, or a body part (an object), generally referred to as 10, is shown schematically in
[0048] The coherent acoustic beam 14 has a large cross-sectional area, typically on the order of a centimeter or a few centimeters. The coherent acoustic beam 14 is directed to a cone shaped reflector 22 and then to a focusing mirror or lens 24, where it is reflected by a curved surface and focused into a convergent beam 30 that is transmitted into a fluid or amorphous medium 32. The focusing mirror 24 pivots under control of a focusing mirror actuator 26, which is under control of the processor 18, which in turn is controlled by the memory 19, which has instructions thereon for instructing the processor 18 to actuate the actuator 26. The cone shaped reflector 22 is under control of an actuator 28 that moves it towards and away from the acoustic emitter (source) 12. The actuator 28 is in turn under control of the processor 18, which is controlled by the memory 19, which has instructions thereon for instructing the processor 18 to actuate the actuator 28. In one embodiment, the fluid or amorphous medium 32 is within the body and is, for example, but not limited to, urine in the bladder, fatty tissue, brain tissue, peritoneum fluid, stomach fluid, the gall bladder and the spleen. The convergent beam 30 converges to a point which is a virtual focused acoustic imaging source 34 at the point of cross-over. The processor 18 under control of the memory 19 is configured to direct the source actuator 16 to cause the coherent acoustic source 12 to move the source 34 into the fluid or amorphous medium 32 and to move it around within the fluid or amorphous medium 32. Further, the processor 18 under control of the memory 19 is configured to move the cone shaped reflector 22 towards and away from the focusing mirror 24, thus moving the virtual source 34 towards and away from the acoustic emitter 12, again positioning the virtual source 34 in the fluid or amorphous medium 32.
The virtual source 34 is located outside of the object being imaged and inside the fluid or amorphous medium 32. The source 34 transmits a plurality of beams 36 that are scattered in all directions three-dimensionally. The scattered beams 36 pass out of the medium 32. By moving the source 34 in the fluid or amorphous medium 32, the plurality of beams 36 scan around outside of the medium 32. The beams 36 enter any object 10 that they encounter, then exit the object 10 as object beams 38, which are detected by an acoustic detector 40. The acoustic detector 40 is aimed at the source 34 such that it can detect the object beams 38. The object beams 38 are refracted as they pass through the object 10. The acoustic detector 40 can move to collect object beams 38 having a range of angular directions. The acoustic detector 40 moves towards and away from the object 10 in order to defocus the image created by combinations of reflected beams 37 and bypass beams 39, such that it becomes photographically visible. A detector actuator 42 is in mechanical communication with the acoustic detector 40 and is under control of the processor 18, which is in electronic communication with the detector actuator 42. Again, the processor 18 receives instructions from the memory 19. The object beams 38 also contain information about the object 10. The information carried by the object beams 38 is analyzed to determine their amplitude and phase according to techniques known in the art. The phase information of the object beam 38 provides information on the object's temperature, composition, magnetic field or electrostatic field, and amplitude measurements provide information on the opaqueness or density of the object. A spatial filter 46 reduces the noise from unwanted scattered beams 36 and is located in front of the acoustic detector 40.
[0049] A couplant 60 is used between the patient and each of the coherent acoustic source 12 and the acoustic detector 40. Alternatively, the patient, or the relevant part of the patient, may be immersed in fluid such that there is a fluid interface between the patient and the device 8. In order for the object 10 to be observed, the source 34 is moved inside the medium 32 by pivoting the focusing mirror 24 and the acoustic detector 40, by shifting the microscope 6, or by repositioning the patient. A vector network analyzer is not required, as the amplitude and phase information of the emitted and received intensities is not used to produce the image; however, a better intensity image can result when the vector network analyzer is used for the temporal filter. An intensity image using Fresnel fringes will form without the temporal filter and the spatial filter, but using these filters improves the intensity image, i.e., gives better spatial resolution, by reducing the apparent size of the virtual source.
[0050] For phase or speed of sound imaging, the vector network analyzer is needed to measure the time difference for receiving the acoustic beam at each element in the detector. Since the path length traveled by the acoustic beam from the emitter/lens assembly to the detector is known, measuring the travel time using the vector network analyzer allows the speed of sound (m/s) to be determined.
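The speed-of-sound calculation described in the paragraph above reduces to dividing the known path length by the measured travel time. A minimal sketch follows; the 0.15 m path and 100 microsecond travel time are hypothetical illustrative values, not values from this disclosure.

```python
# Sketch of the speed-of-sound determination: known path length divided
# by the travel time measured with the vector network analyzer.

def speed_of_sound(path_length_m, travel_time_s):
    """Speed of sound (m/s) from a known path length and a measured travel time."""
    return path_length_m / travel_time_s

# Example: a 0.15 m emitter-to-detector path traversed in 100 microseconds
# corresponds to roughly 1500 m/s, the nominal speed of sound in soft tissue.
speed = speed_of_sound(0.15, 100e-6)
print(speed)
```

A measured speed that differs from the nominal tissue value across part of the image is what the later paragraphs use to diagnose structures.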
[0052] Using the diffuse acoustic confocal imager, the emitted coherent beam 14 is very flat (spatial coherence) and constant (temporal coherence). The coherent beam 14 is focused into a convergent beam 30 which converges to a point, the virtual source 34, which emits scattered beams 36. The virtual source 34 can be moved around to allow the scattered beams 36 to hit the object 10 from many spots, distances and angles. The scattered beams 36 either miss the object 10, hit the edge of the object 10 and reflect off it, or pass through the object 10.
[0053] Those that are reflected off the object are referred to as reflected beams 37. Those that pass through the object are referred to as the object beams 38. Those that miss the object but overlap with the reflected beams 37 are called bypass beams 39.
[0054] The object beams 38 can be considered for diagnostic purposes of the object 10. To form a speed of sound image using temporal coherence, the emission time of the beam is measured and then the arrival time of the beam at each pixel in the image is measured. Any differences in the speed of sound across the image can be used to diagnose structures in the image.
[0055] The spatial coherent interference between the reflected beams 37 and the bypass beams 39 in the overlap is used to create the image. The image of the object 10 can be considered as an inline hologram for diagnostic purposes. More specifically, the image is created using the principle of Fresnel diffraction. The bypass beam 39 overlaps with the reflected beam 37 to produce an interference zone 141. The interference zone 141 between these two beams forms the Fresnel fringe or defocus fringe. The retention of coherence in the bypass beam 39 and the reflected beam 37 is required for forming an interference fringe, or Fresnel fringe, in the image. Some intensity in the defocus or Fresnel fringe also comes from the refracted object beams 38 exiting the object 10 and interfering with the bypass beam 39 in the same manner in which the bypass beam and the reflected beam interfere, although the refracted object beam plays a less significant role in the formation of the Fresnel fringe.
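The fringe formation described above can be illustrated with a minimal one-dimensional two-beam interference model: a "reflected" beam and a "bypass" beam treated as coherent waves from two slightly separated effective source points overlap on a detector line and produce intensity fringes. This is a generic coherent-interference sketch, not the disclosed device; all geometry (source separation, detector distance) is hypothetical, with the 50 MHz / 1500 m/s values taken from paragraph [0058].

```python
# Two coherent acoustic point sources interfering on a 1-D detector line.
import numpy as np

c = 1500.0               # nominal speed of sound in soft tissue, m/s
f = 50e6                 # acoustic frequency, Hz
k = 2 * np.pi * f / c    # wavenumber, rad/m (wavelength = 30 microns)

# Detector sample positions (m) and two hypothetical effective sources:
# one standing in for the reflected beam, one for the bypass beam.
x = np.linspace(-3e-3, 3e-3, 2001)
src_reflected = np.array([-75e-6, 0.0])
src_bypass = np.array([75e-6, 0.0])
z_det = 5e-3             # source-plane-to-detector distance, m

def path(src, x):
    """Exact distance from a source point to each detector position."""
    return np.hypot(x - src[0], z_det - src[1])

# Coherent sum of the two unit-amplitude waves; the intensity oscillates
# between constructive (near 4) and destructive (near 0) interference.
field = np.exp(1j * k * path(src_reflected, x)) + np.exp(1j * k * path(src_bypass, x))
intensity = np.abs(field) ** 2
print(intensity.max(), intensity.min())
```

With these numbers the fringe period is roughly lambda × z / s = 30e-6 × 5e-3 / 150e-6 = 1 mm, so several fringes span the detector line; losing coherence in either beam would wash the modulation out, which is why the paragraph above requires coherence retention.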
[0056] The Fresnel diffraction produces a fringe (Fresnel fringe) in an acoustic image when defocusing occurs. The Fresnel fringe enhances the contrast between the forms and the background and allows for the imaging of soft tissues, and the interface between different soft tissues. This includes tissues that have very little difference in refractive indexes, such as for example, but not limited to, breast tissue and milk glands in the breast tissue, lesions in tissues, an egg in a fallopian tube.
[0057] The width of the overlap increases with defocus, which increases the width of the Fresnel fringe. The defocus decreases to zero where the object and camera are on the same plane. In this condition, the object disappears and cannot be seen, because no fringe can be made as there is no overlap.
[0058] Additionally, the spatial resolution is determined by the width of the Fresnel fringe. The smallest width of the Fresnel fringe found in the image is the size of the virtual source. The size of the virtual source is determined by the focusing ability of the emitter/lens assembly and the wavelength of the beam emitted by the emitter. The resolution can approach the wavelength of the acoustic beam, which for a 50 MHz acoustic beam approaches the size of a cell (lambda = speed/frequency = 1500 m/s / 50,000,000 per second = 0.00003 m = 30 microns). This is much higher resolution than ultrasound.
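The wavelength arithmetic in the paragraph above can be restated directly: wavelength is the speed of sound divided by the frequency, using the nominal 1500 m/s soft-tissue value.

```python
# Wavelength of a 50 MHz acoustic beam in soft tissue: lambda = c / f.
speed = 1500.0       # m/s, nominal speed of sound in soft tissue
frequency = 50e6     # Hz
wavelength = speed / frequency
print(wavelength)    # 0.00003 m, i.e. 30 microns
```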
[0059] An image formed with a large defocus, i.e., broad fringe lines, can be processed with the processor to sharpen the features (i.e., the Fresnel fringes) in the image by applying a defocus amount, delta f, and knowing the cone angle, alpha, of the beam, such that the reduction in fringe width is delta f times the cone angle. Likewise, knowing the cone angle and the change in fringe width for a known or measured defocus can be used to determine the distance or position of the object, z, in the image, enabling a 3D image to be produced since the lateral dimensions, x and y, are already measurable in the image. This is a further advantage of a transmission image as produced by the present technology over a reflected image (ultrasound), which acquires z by a complex time measurement of the reflected beam.
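The linear relation stated above, fringe-width change equals delta f times the cone angle, can be sketched and inverted to recover depth. This is a small-angle sketch under the paragraph's stated relation; the 0.1 rad cone angle and 2 mm defocus are hypothetical values, not parameters of the disclosed device.

```python
# Fringe width grows linearly with defocus at a rate set by the beam's
# cone angle, so a measured fringe width over a known cone angle yields
# the defocus, i.e. the object's depth offset z.

def fringe_width(defocus_m, cone_angle_rad):
    """Fringe width ~ defocus x cone angle (small-angle relation)."""
    return abs(defocus_m) * cone_angle_rad

def depth_from_fringe(width_m, cone_angle_rad):
    """Invert the relation: depth offset z from a measured fringe width."""
    return width_m / cone_angle_rad

alpha = 0.1                       # cone angle, rad (hypothetical)
w = fringe_width(2e-3, alpha)     # 2 mm defocus -> 200 micron fringe width
z = depth_from_fringe(w, alpha)   # recovers the 2 mm depth offset
print(w, z)
```

Combining the recovered z with the x and y positions read directly off the image is what enables the 3D reconstruction the paragraph describes.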
[0060] The distance between the virtual source 34 and the object 10 determines the magnification of the object. The further the virtual source 34 is from the object 10, the closer the magnification approaches one. The magnification of the object increases as the virtual source 34 approaches the object 10.
[0061] The spatial filter 46 and a temporal filter 54 (see
[0062] In an embodiment shown in
[0063] In order for the object 10 to be observed, the source 134 is moved inside the medium 132 by pivoting the focusing mirror 124 and the acoustic detector 140, by moving the device 110, or by repositioning the patient. If the object is, for example, a disease, a mass, a tumour, a growth or the like, the coherent interference between the reflected beams 137 and the bypass beams 139 in the overlap (interference zone 141) is used to create the image of the object. The Fresnel fringe is what creates the image and is caused by defocusing, as described above.
[0064] As shown in
[0065] As shown in
[0066] As shown in
[0067] As shown in
[0068] In all embodiments, the method involves the acoustic emitter emitting 600 a coherent beam, which is optionally reflected by the cone shaped reflector, which can be moved in and out by an actuator. The beam is focused by the focusing lens or mirror into a virtual source in an amorphous material or fluid. The virtual source is moved around so that it emits scattered beams, of which some pass through an object, some bypass the object and some reflect off the object. The bypass beams and the reflected beams interfere with one another in an interference zone to provide an image, which can be seen as a Fresnel fringe by defocusing the image.
[0069] While example embodiments have been described in connection with what is presently considered to be the most practical and/or suitable embodiment, it is to be understood that the descriptions are not to be limited to the disclosed embodiments, but, on the contrary, are intended to cover various modifications and equivalent arrangements included within the spirit and scope of the example embodiments. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific example embodiments described herein. Such equivalents are intended to be encompassed in the scope of the claims, if appended hereto or subsequently filed.